Apr 28 19:16:38.533889 ip-10-0-143-22 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:38.963528 ip-10-0-143-22 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:38.963528 ip-10-0-143-22 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:38.963528 ip-10-0-143-22 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:38.963528 ip-10-0-143-22 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:38.963528 ip-10-0-143-22 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:38.966808 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.966691 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:38.971811 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971794 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:38.971811 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971811 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971818 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971821 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971824 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971827 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971830 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971834 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971837 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971848 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971852 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971856 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971860 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971863 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971869 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971872 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971875 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971878 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971883 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971886 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:38.971887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971889 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971892 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971895 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971898 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971901 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971904 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971907 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971912 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971915 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971917 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971921 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971923 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971926 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971928 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971932 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971935 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971938 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971940 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971943 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971948 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:38.972359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971951 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971954 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971965 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971968 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971971 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971973 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971976 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971978 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971981 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971984 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971986 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971991 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971994 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.971996 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972000 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972003 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972006 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972009 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972011 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:38.972866 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972014 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972017 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972020 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972022 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972025 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972031 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972034 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972036 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972039 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972042 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972044 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972047 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972050 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972052 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972055 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972058 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972062 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972065 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972070 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972073 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:38.973336 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972078 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972080 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972083 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972086 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972089 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972091 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972094 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972743 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972750 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972753 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972756 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972758 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972761 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972764 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972767 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972769 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972772 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972775 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972780 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972784 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:38.973848 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972787 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972789 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972792 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972795 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972797 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972800 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972803 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972805 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972809 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972811 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972816 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972819 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972822 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972824 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972827 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972830 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972832 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972835 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972838 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972841 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:38.974365 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972844 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972847 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972852 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972855 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972858 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972860 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972864 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972867 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972870 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972873 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972875 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972880 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972884 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972887 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972892 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972895 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972897 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972900 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972902 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:38.974923 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972905 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972908 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972911 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972913 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972916 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972919 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972921 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972924 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972929 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972932 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972934 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972937 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972940 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972942 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972944 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972947 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972950 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972952 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972955 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:38.975394 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972957 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972963 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972966 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972968 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972971 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972974 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972976 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972979 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972981 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972984 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972986 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972989 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972991 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972996 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.972999 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974789 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974821 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974832 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974837 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974842 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974846 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974851 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:38.975896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974855 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974859 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974862 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974865 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974869 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974872 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974875 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974878 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974881 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974884 2578 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974887 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974890 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974895 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974898 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974901 2578 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974904 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974909 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974913 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974919 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974923 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974927 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974930 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974935 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974940 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974943 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:38.976401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974946 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974955 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974959 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974964 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974967 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974972 2578 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974977 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974984 2578 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974987 2578 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974992 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974995 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.974998 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975012 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975015 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975018 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975023 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975026 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975030 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975034 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975038 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975041 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975045 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975048 2578 flags.go:64]
FLAG: --feature-gates="" Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975052 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975054 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:16:38.977027 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975058 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975061 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975064 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975067 2578 flags.go:64] FLAG: --help="false" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975070 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-143-22.ec2.internal" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975073 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975076 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975079 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975083 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975087 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975090 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:16:38.975093 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975096 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975099 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975102 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975105 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975108 2578 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975111 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975113 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975116 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975119 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975122 2578 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975125 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975128 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:38.977617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975131 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975136 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:16:38.978203 
ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975139 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975142 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975145 2578 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975148 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975151 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975154 2578 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975157 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975162 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975165 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975169 2578 flags.go:64] FLAG: --max-pods="110" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975172 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975175 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975177 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975181 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975185 2578 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975188 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975192 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975202 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975205 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975208 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975211 2578 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:38.978203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975214 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975219 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975222 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975225 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975228 2578 flags.go:64] FLAG: --port="10250" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975231 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975234 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ea793de61ccefc7d" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:16:38.975237 2578 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975240 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975243 2578 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975246 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975249 2578 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975253 2578 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975256 2578 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975258 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975261 2578 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975265 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975268 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975271 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975274 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975276 2578 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975279 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975282 2578 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975285 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975287 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975290 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:38.978772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975294 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975298 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975301 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975305 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975308 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975311 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975314 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975316 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975319 2578 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975322 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975328 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 
19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975331 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975334 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975338 2578 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975341 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975344 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975347 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975350 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975353 2578 flags.go:64] FLAG: --v="2" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975357 2578 flags.go:64] FLAG: --version="false" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975362 2578 flags.go:64] FLAG: --vmodule="" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975366 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.975370 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975474 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975478 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:38.979392 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975481 2578 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975484 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975486 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975489 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975491 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975494 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975497 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975499 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975503 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975505 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975508 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975511 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975514 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:38.980011 ip-10-0-143-22 
kubenswrapper[2578]: W0428 19:16:38.975517 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975519 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975522 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975524 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975527 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975529 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:38.980011 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975533 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975537 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975539 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975543 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975546 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975549 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975553 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975556 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975558 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975561 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975564 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975567 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975569 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975572 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975574 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975577 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975580 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975582 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975585 2578 feature_gate.go:328] unrecognized 
feature gate: AzureMultiDisk Apr 28 19:16:38.980524 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975588 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975591 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975594 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975597 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975599 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975602 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975605 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975608 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975610 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975613 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975615 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975618 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 
19:16:38.975620 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975623 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975641 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975645 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975647 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975650 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975653 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975655 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:38.981020 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975658 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975660 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975663 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975666 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975668 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 
19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975671 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975673 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975676 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975678 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975681 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975683 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975686 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975688 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975691 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975694 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975696 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975699 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975701 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 
28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975704 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975707 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:38.981511 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975709 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975712 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975715 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975717 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975719 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.975722 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:38.982071 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.976424 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:16:38.984213 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.984190 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 28 
19:16:38.984213 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.984213 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984284 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984293 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984297 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984303 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984306 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984310 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984314 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984318 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984322 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984326 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984330 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984334 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles 
Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984341 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984347 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984352 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984356 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984361 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984365 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984369 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:38.984366 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984373 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984377 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984381 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984386 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984390 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 
19:16:38.984394 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984398 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984402 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984406 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984410 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984414 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984418 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984423 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984427 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984432 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984436 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984440 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984444 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:38.985266 
ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984448 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984452 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:38.985266 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984456 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984459 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984464 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984468 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984472 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984476 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984480 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984485 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984489 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984493 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984497 2578 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984501 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984505 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984509 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984513 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984517 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984521 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984525 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984530 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984534 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:38.985882 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984538 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984542 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984545 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984549 2578 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984553 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984559 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984563 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984569 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984575 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984581 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984586 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984590 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984594 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984598 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984603 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984607 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA 
Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984611 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984615 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984619 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:38.986552 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984624 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984647 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984652 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984656 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984660 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984664 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984669 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984673 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.984681 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984846 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984855 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984859 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984864 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984868 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984872 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:38.987612 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984877 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984881 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984885 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984889 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 
19:16:38.984894 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984899 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984903 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984907 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984911 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984915 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984919 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984923 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984927 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984931 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984936 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984940 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984944 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 
19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984949 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984953 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984956 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:38.988104 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984960 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984964 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984968 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984973 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984977 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984981 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984985 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984989 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984994 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.984997 2578 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985001 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985005 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985009 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985013 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985017 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985021 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985025 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985033 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985039 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:38.988739 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985044 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985048 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985052 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985057 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985062 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985067 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985071 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985076 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985080 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985084 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985088 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985093 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985097 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985100 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985104 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985108 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985113 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985117 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985122 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:38.989199 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985126 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985130 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985134 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985138 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985142 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 
19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985145 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985150 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985154 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985158 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985162 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985166 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985170 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985175 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985179 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985183 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985187 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985191 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985195 2578 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985200 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985205 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:16:38.989745 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985210 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:38.985215 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.985223 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.985395 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.988409 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.989350 2578 server.go:1019] "Starting client certificate rotation" Apr 28 19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.989453 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 28 
19:16:38.990220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:38.989485 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 28 19:16:39.014100 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.014068 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 28 19:16:39.018577 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.018551 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 28 19:16:39.035384 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.035362 2578 log.go:25] "Validated CRI v1 runtime API" Apr 28 19:16:39.041112 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.041095 2578 log.go:25] "Validated CRI v1 image API" Apr 28 19:16:39.042274 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.042256 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 28 19:16:39.046351 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.046328 2578 fs.go:135] Filesystem UUIDs: map[3acd5adc-1be2-448e-9d2c-a4b0bdced8fc:/dev/nvme0n1p4 629ed0ce-704c-43c8-a1c2-d182b680e2af:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 28 19:16:39.046438 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.046350 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 28 19:16:39.052303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.052187 2578 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:39.050272991 +0000 UTC 
m=+0.400714388 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100741 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27fd50c6132f987ee7a11ac7810dac SystemUUID:ec27fd50-c613-2f98-7ee7-a11ac7810dac BootID:b37e6b37-d86a-4a44-8dc5-d002a593a54a Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:aa:b1:c7:a3:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:aa:b1:c7:a3:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:d7:b4:26:1b:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction 
Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 28 19:16:39.052303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.052292 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 28 19:16:39.052449 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.052411 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 28 19:16:39.053695 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.053655 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 28 19:16:39.054340 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.054317 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:39.054340 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.053700 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-143-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 28 19:16:39.054421 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.054360 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 28 19:16:39.054421 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.054373 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 28 19:16:39.054421 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.054391 2578 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:39.055563 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.055551 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:39.056765 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.056755 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:39.056877 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.056868 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:16:39.059159 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.059149 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:16:39.059193 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.059163 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 28 19:16:39.059193 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.059177 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:16:39.059193 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.059186 2578 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:16:39.059284 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.059200 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 28 19:16:39.060305 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.060294 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:39.060349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.060313 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:39.063274 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.063254 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:16:39.064582 ip-10-0-143-22 
kubenswrapper[2578]: I0428 19:16:39.064568 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:16:39.066089 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066073 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066093 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066099 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066106 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066112 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066119 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066125 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066131 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066139 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066145 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:16:39.066167 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066166 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:16:39.066447 
ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.066176 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:39.067197 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.067187 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:39.067232 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.067198 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:39.070997 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.070984 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:39.071059 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.071022 2578 server.go:1295] "Started kubelet" Apr 28 19:16:39.071133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.071107 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:16:39.071180 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.071117 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:39.071220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.071185 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:39.071944 ip-10-0-143-22 systemd[1]: Started Kubernetes Kubelet. 
Apr 28 19:16:39.072211 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.072188 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:39.072553 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.072447 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:39.074420 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.074371 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:39.074506 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.074371 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:16:39.075314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.075292 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:39.081082 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.081061 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:39.081487 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.081467 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:39.081616 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.081604 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:39.081968 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.080962 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-22.ec2.internal.18aa9b56920e79e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-22.ec2.internal,UID:ip-10-0-143-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-22.ec2.internal,},FirstTimestamp:2026-04-28 19:16:39.070996963 +0000 UTC m=+0.421438360,LastTimestamp:2026-04-28 19:16:39.070996963 +0000 UTC m=+0.421438360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-22.ec2.internal,}" Apr 28 19:16:39.082455 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.082433 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:39.082455 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.082435 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:39.082577 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.082458 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.082577 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.082475 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:39.082577 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:16:39.082553 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:39.082577 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.082563 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:39.083161 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083147 2578 factory.go:55] Registering systemd factory Apr 28 19:16:39.083243 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083198 2578 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:39.083389 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083376 2578 factory.go:153] Registering CRI-O factory Apr 28 19:16:39.083443 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083392 2578 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:39.083443 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083441 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:39.083538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083468 2578 factory.go:103] Registering Raw factory Apr 28 19:16:39.083538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.083483 2578 manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:39.084322 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.084308 2578 manager.go:319] Starting recovery of all containers Apr 28 19:16:39.091010 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.090974 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:16:39.091123 ip-10-0-143-22 kubenswrapper[2578]: E0428 
19:16:39.091100 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:16:39.094303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.094159 2578 manager.go:324] Recovery completed Apr 28 19:16:39.098899 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.098887 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.101172 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101147 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.101231 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101186 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.101231 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101196 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.101711 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101699 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:16:39.101779 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101712 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:39.101779 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.101729 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:39.103451 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.103388 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-143-22.ec2.internal.18aa9b5693dae937 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-22.ec2.internal,UID:ip-10-0-143-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-22.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-22.ec2.internal,},FirstTimestamp:2026-04-28 19:16:39.101172023 +0000 UTC m=+0.451613422,LastTimestamp:2026-04-28 19:16:39.101172023 +0000 UTC m=+0.451613422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-22.ec2.internal,}" Apr 28 19:16:39.103704 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.103691 2578 policy_none.go:49] "None policy: Start" Apr 28 19:16:39.103765 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.103710 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:39.103765 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.103723 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:39.108878 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.108859 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v25bs" Apr 28 19:16:39.114401 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.114335 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-22.ec2.internal.18aa9b5693db32ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-22.ec2.internal,UID:ip-10-0-143-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-22.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-22.ec2.internal,},FirstTimestamp:2026-04-28 19:16:39.101190842 +0000 UTC m=+0.451632239,LastTimestamp:2026-04-28 19:16:39.101190842 +0000 UTC m=+0.451632239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-22.ec2.internal,}" Apr 28 19:16:39.117857 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.117841 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v25bs" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145213 2578 manager.go:341] "Starting Device Plugin manager" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.145241 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145250 2578 server.go:85] "Starting device plugin registration server" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145489 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145502 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145594 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145734 2578 plugin_manager.go:116] "The desired_state_of_world 
populator (plugin watcher) starts" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.145749 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.146315 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:16:39.157072 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.146353 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.189181 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.189148 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:16:39.190509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.190489 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:39.190617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.190518 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:39.190617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.190537 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 28 19:16:39.190617 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.190544 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:39.190617 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.190575 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:39.193520 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.193502 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:39.246177 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.246094 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.247365 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.247348 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.247448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.247381 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.247448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.247392 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.247448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.247418 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.256005 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.255988 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.256047 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.256011 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-22.ec2.internal\": node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.279843 
ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.279824 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.291168 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.291149 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal"] Apr 28 19:16:39.291232 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.291223 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.292117 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.292101 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.292186 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.292132 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.292186 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.292144 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.293130 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.293119 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.293280 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.293266 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.293315 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.293295 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.296133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296117 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.296209 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296122 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.296209 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296181 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.296209 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296149 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.296328 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296199 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.296328 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.296223 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.297258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.297246 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.297295 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.297271 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:39.297919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.297899 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:39.297997 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.297929 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:39.297997 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.297939 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:39.327267 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.327246 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-22.ec2.internal\" not found" node="ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.331581 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.331562 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-22.ec2.internal\" not found" node="ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.380222 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.380198 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.481250 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.481203 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.484566 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.484548 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/530700f112f77c890973fb51d737f28d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-22.ec2.internal\" (UID: \"530700f112f77c890973fb51d737f28d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.484652 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.484575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.484652 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.484595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.581991 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.581919 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.585152 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 
19:16:39.585204 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/530700f112f77c890973fb51d737f28d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-22.ec2.internal\" (UID: \"530700f112f77c890973fb51d737f28d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.585204 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.585265 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.585265 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ce95a4f961ae9992c062db4f234920c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal\" (UID: \"7ce95a4f961ae9992c062db4f234920c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.585326 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.585243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/530700f112f77c890973fb51d737f28d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-22.ec2.internal\" (UID: \"530700f112f77c890973fb51d737f28d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.629309 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.629283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.633994 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.633964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:39.682864 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.682833 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.783306 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.783274 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.883726 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.883697 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.984253 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:39.984225 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:39.989518 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.989507 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 28 19:16:39.989652 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:39.989623 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: 
k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:16:40.081813 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.081785 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:40.085180 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:40.085156 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:40.093783 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.093762 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:40.120330 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.120301 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:39 +0000 UTC" deadline="2028-01-18 23:46:41.72425218 +0000 UTC" Apr 28 19:16:40.120404 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.120329 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15124h30m1.603926779s" Apr 28 19:16:40.123726 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.123708 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-smc95" Apr 28 19:16:40.127162 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:40.127131 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod530700f112f77c890973fb51d737f28d.slice/crio-fa2c42055631c2533dfb7b4c5917bcabf116759d0729e8df91cd5bd25f8058ae WatchSource:0}: Error finding container fa2c42055631c2533dfb7b4c5917bcabf116759d0729e8df91cd5bd25f8058ae: Status 404 returned error can't find 
the container with id fa2c42055631c2533dfb7b4c5917bcabf116759d0729e8df91cd5bd25f8058ae Apr 28 19:16:40.127694 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:40.127677 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce95a4f961ae9992c062db4f234920c.slice/crio-1951e354b5a005b00e1399035e660e6cc712ad1b83f9a9b132eaa95bddf96d15 WatchSource:0}: Error finding container 1951e354b5a005b00e1399035e660e6cc712ad1b83f9a9b132eaa95bddf96d15: Status 404 returned error can't find the container with id 1951e354b5a005b00e1399035e660e6cc712ad1b83f9a9b132eaa95bddf96d15 Apr 28 19:16:40.132206 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.132192 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:16:40.133224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.133206 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-smc95" Apr 28 19:16:40.139592 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.139575 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:40.185611 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:40.185560 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:40.193910 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.193853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" event={"ID":"530700f112f77c890973fb51d737f28d","Type":"ContainerStarted","Data":"fa2c42055631c2533dfb7b4c5917bcabf116759d0729e8df91cd5bd25f8058ae"} Apr 28 19:16:40.194764 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.194740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" 
event={"ID":"7ce95a4f961ae9992c062db4f234920c","Type":"ContainerStarted","Data":"1951e354b5a005b00e1399035e660e6cc712ad1b83f9a9b132eaa95bddf96d15"} Apr 28 19:16:40.286274 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:40.286224 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:40.386783 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:40.386701 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-22.ec2.internal\" not found" Apr 28 19:16:40.391746 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.391731 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:40.481939 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.481899 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" Apr 28 19:16:40.493140 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.493116 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:40.494071 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.494042 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" Apr 28 19:16:40.508053 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.508028 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:40.551036 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:40.551012 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:41.008125 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.008095 
2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:41.060777 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.060739 2578 apiserver.go:52] "Watching apiserver" Apr 28 19:16:41.071852 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.071824 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:16:41.073397 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.073370 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bgmp8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal","openshift-multus/multus-additional-cni-plugins-v4wsc","openshift-multus/network-metrics-daemon-2ssxm","openshift-network-operator/iptables-alerter-jgrh8","kube-system/konnectivity-agent-p5qkw","openshift-image-registry/node-ca-hp74n","openshift-multus/multus-m4ddb","openshift-network-diagnostics/network-check-target-k9zr5","openshift-ovn-kubernetes/ovnkube-node-ssm92","kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws","openshift-cluster-node-tuning-operator/tuned-sc7j5"] Apr 28 19:16:41.075477 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.075453 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hp74n" Apr 28 19:16:41.076669 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.076627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.078434 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.078406 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 28 19:16:41.078852 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.078830 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.078936 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.078889 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:41.079022 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.078970 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:16:41.080070 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.080050 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.080978 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.080958 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lfw5v\"" Apr 28 19:16:41.081069 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.080982 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.081124 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.081104 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:16:41.081124 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.081114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.081242 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.081184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vjt94\"" Apr 28 19:16:41.082372 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.082353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.083584 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.083565 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.083761 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.083725 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 28 19:16:41.084003 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.083986 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:16:41.084083 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084006 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:16:41.084133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084095 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 28 19:16:41.084338 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084321 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bn9qj\"" Apr 28 
19:16:41.084431 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.084581 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084564 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.084766 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ffmqm\"" Apr 28 19:16:41.084916 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084899 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:16:41.085002 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.084985 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.085225 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.085209 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:16:41.085329 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.085276 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:16:41.085382 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.085366 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.085431 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.085421 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:16:41.086661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.086624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.086763 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.086705 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.086763 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.086745 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.086866 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.086804 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:16:41.087232 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.087214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2t6hn\"" Apr 28 19:16:41.087842 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.087822 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.089449 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.089429 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.089575 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.089560 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 28 19:16:41.089749 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.089730 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-knvt4\"" Apr 28 19:16:41.090728 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.090472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 28 19:16:41.090728 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.090591 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:16:41.091305 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.091069 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.091305 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.091081 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:41.091305 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.091093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.091305 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.091209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zzcp7\"" Apr 28 19:16:41.091695 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.091670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:16:41.092092 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.092067 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-ftc49\"" Apr 28 19:16:41.092180 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.092093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.093297 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-multus-certs\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093342 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2352f752-8d71-483d-9d43-b79ba63f8cad-iptables-alerter-script\") pod 
\"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2352f752-8d71-483d-9d43-b79ba63f8cad-host-slash\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cnibin\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-hosts-file\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-system-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cnibin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093490 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-os-release\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9t7\" (UniqueName: \"kubernetes.io/projected/96593340-195c-4a9b-8d15-babb74ebf1c6-kube-api-access-pp9t7\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j9zk\" (UniqueName: \"kubernetes.io/projected/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-kube-api-access-6j9zk\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-socket-dir-parent\") pod \"multus-m4ddb\" (UID: 
\"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-multus\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-system-cni-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-os-release\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g2k7h\" (UniqueName: \"kubernetes.io/projected/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-kube-api-access-g2k7h\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-kubelet\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-etc-kubernetes\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " 
pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/692b128d-82a4-4c26-b17d-0b4d804ef295-serviceca\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.093929 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-tmp-dir\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-k8s-cni-cncf-io\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-hostroot\") pod 
\"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.093978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4wv\" (UniqueName: \"kubernetes.io/projected/6544e7a1-69d4-41e0-b18d-961cdaa5418d-kube-api-access-hs4wv\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmt9\" (UniqueName: \"kubernetes.io/projected/2352f752-8d71-483d-9d43-b79ba63f8cad-kube-api-access-bmmt9\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/88c3f56d-6859-4f8e-a645-45fb36262479-agent-certs\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/88c3f56d-6859-4f8e-a645-45fb36262479-konnectivity-ca\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cni-binary-copy\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/692b128d-82a4-4c26-b17d-0b4d804ef295-host\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9kp\" (UniqueName: \"kubernetes.io/projected/692b128d-82a4-4c26-b17d-0b4d804ef295-kube-api-access-ln9kp\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-netns\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-bin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-conf-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094339 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-daemon-config\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.094802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.094623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rdqkb\""
Apr 28 19:16:41.095835 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.095814 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:16:41.134194 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.134172 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:40 +0000 UTC" deadline="2027-10-06 01:53:57.373056299 +0000 UTC"
Apr 28 19:16:41.134194 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.134193 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12606h37m16.238865773s"
Apr 28 19:16:41.183733 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.183705 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 28 19:16:41.194696 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-os-release\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.194811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-slash\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.194811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-systemd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.194811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-run\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.194811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-socket-dir-parent\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-os-release\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-systemd-units\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-var-lib-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-socket-dir-parent\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194880 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-kubernetes\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-os-release\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-os-release\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.194972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.194967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2k7h\" (UniqueName: \"kubernetes.io/projected/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-kube-api-access-g2k7h\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-kubelet\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-var-lib-kubelet\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-host\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-etc-kubernetes\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-kubelet\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195195 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-etc-kubernetes\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-config\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.195266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-socket-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/692b128d-82a4-4c26-b17d-0b4d804ef295-serviceca\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-tmp-dir\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmt9\" (UniqueName: \"kubernetes.io/projected/2352f752-8d71-483d-9d43-b79ba63f8cad-kube-api-access-bmmt9\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/88c3f56d-6859-4f8e-a645-45fb36262479-agent-certs\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/88c3f56d-6859-4f8e-a645-45fb36262479-konnectivity-ca\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/692b128d-82a4-4c26-b17d-0b4d804ef295-host\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-netns\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-conf-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j26db\" (UniqueName: \"kubernetes.io/projected/6abb021c-2028-4afc-a02f-952af6060a13-kube-api-access-j26db\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-multus-certs\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-ovn\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-netd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.195676 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2352f752-8d71-483d-9d43-b79ba63f8cad-iptables-alerter-script\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cnibin\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-hosts-file\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-node-log\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-etc-selinux\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/692b128d-82a4-4c26-b17d-0b4d804ef295-serviceca\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-modprobe-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9t7\" (UniqueName: \"kubernetes.io/projected/96593340-195c-4a9b-8d15-babb74ebf1c6-kube-api-access-pp9t7\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j9zk\" (UniqueName: \"kubernetes.io/projected/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-kube-api-access-6j9zk\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-multus\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-binary-copy\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-systemd\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-tuned\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.195992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-tmp\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-system-cni-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.196285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196065 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-kubelet\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-bin\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-env-overrides\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-registration-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-tmp-dir\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt628\" (UniqueName: \"kubernetes.io/projected/b9a9c874-a655-4f8e-9492-86265496a4e7-kube-api-access-kt628\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-device-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-sys-fs\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196262 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-conf\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-k8s-cni-cncf-io\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-hostroot\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4wv\" (UniqueName: \"kubernetes.io/projected/6544e7a1-69d4-41e0-b18d-961cdaa5418d-kube-api-access-hs4wv\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196357 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cni-binary-copy\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-etc-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-lib-modules\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9kp\" (UniqueName: \"kubernetes.io/projected/692b128d-82a4-4c26-b17d-0b4d804ef295-kube-api-access-ln9kp\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-bin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-daemon-config\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-netns\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-script-lib\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196645 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysconfig\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-log-socket\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wr2h\" (UniqueName: \"kubernetes.io/projected/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-kube-api-access-7wr2h\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92"
Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-sys\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2352f752-8d71-483d-9d43-b79ba63f8cad-host-slash\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.197757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-system-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cnibin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/88c3f56d-6859-4f8e-a645-45fb36262479-konnectivity-ca\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cnibin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/692b128d-82a4-4c26-b17d-0b4d804ef295-host\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " pod="openshift-image-registry/node-ca-hp74n" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-system-cni-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-netns\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.196084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-multus\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.197092 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.197169 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:41.697139344 +0000 UTC m=+3.047580736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-conf-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-multus-certs\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197494 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-hosts-file\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-cnibin\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-var-lib-cni-bin\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2352f752-8d71-483d-9d43-b79ba63f8cad-iptables-alerter-script\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197741 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-hostroot\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.198371 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.197810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-host-run-k8s-cni-cncf-io\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-cni-binary-copy\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6544e7a1-69d4-41e0-b18d-961cdaa5418d-system-cni-dir\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2352f752-8d71-483d-9d43-b79ba63f8cad-host-slash\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.199158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.198799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6544e7a1-69d4-41e0-b18d-961cdaa5418d-multus-daemon-config\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.200492 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.200472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/88c3f56d-6859-4f8e-a645-45fb36262479-agent-certs\") pod \"konnectivity-agent-p5qkw\" (UID: \"88c3f56d-6859-4f8e-a645-45fb36262479\") " pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:16:41.209117 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.209097 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-g2k7h\" (UniqueName: \"kubernetes.io/projected/d18eaae1-d122-4fa3-8b2e-ffc7868bfd03-kube-api-access-g2k7h\") pod \"multus-additional-cni-plugins-v4wsc\" (UID: \"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03\") " pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.214721 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.214672 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:41.214721 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.214703 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:41.214876 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.214726 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:41.214876 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.214833 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:41.71481047 +0000 UTC m=+3.065251872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:41.216045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.215519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmt9\" (UniqueName: \"kubernetes.io/projected/2352f752-8d71-483d-9d43-b79ba63f8cad-kube-api-access-bmmt9\") pod \"iptables-alerter-jgrh8\" (UID: \"2352f752-8d71-483d-9d43-b79ba63f8cad\") " pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.216045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.216006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9t7\" (UniqueName: \"kubernetes.io/projected/96593340-195c-4a9b-8d15-babb74ebf1c6-kube-api-access-pp9t7\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:41.216045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.216037 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4wv\" (UniqueName: \"kubernetes.io/projected/6544e7a1-69d4-41e0-b18d-961cdaa5418d-kube-api-access-hs4wv\") pod \"multus-m4ddb\" (UID: \"6544e7a1-69d4-41e0-b18d-961cdaa5418d\") " pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.216564 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.216541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9kp\" (UniqueName: \"kubernetes.io/projected/692b128d-82a4-4c26-b17d-0b4d804ef295-kube-api-access-ln9kp\") pod \"node-ca-hp74n\" (UID: \"692b128d-82a4-4c26-b17d-0b4d804ef295\") " 
pod="openshift-image-registry/node-ca-hp74n" Apr 28 19:16:41.216762 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.216738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j9zk\" (UniqueName: \"kubernetes.io/projected/d5c1a9d5-7a1d-4369-837a-3ed96d5f107f-kube-api-access-6j9zk\") pod \"node-resolver-bgmp8\" (UID: \"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f\") " pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.297363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-netd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.297363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-node-log\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.297363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-etc-selinux\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-node-log\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-netd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-modprobe-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-etc-selinux\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-systemd\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-modprobe-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-tuned\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-tmp\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-kubelet\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.297615 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-bin\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-env-overrides\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-registration-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt628\" (UniqueName: \"kubernetes.io/projected/b9a9c874-a655-4f8e-9492-86265496a4e7-kube-api-access-kt628\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-d\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-device-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: 
\"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-sys-fs\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-conf\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-etc-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-lib-modules\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-netns\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-script-lib\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297982 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysconfig\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-cni-bin\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-log-socket\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-kubelet\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wr2h\" (UniqueName: \"kubernetes.io/projected/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-kube-api-access-7wr2h\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.297584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-systemd\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-device-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-sys\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysctl-conf\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-slash\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298226 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-systemd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-lib-modules\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-run\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-systemd-units\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-netns\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-var-lib-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-kubernetes\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-sysconfig\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-registration-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.298919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-var-lib-kubelet\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298418 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-slash\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-kubernetes\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-etc-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-log-socket\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-systemd-units\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-host\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-var-lib-kubelet\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-config\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-run\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-systemd\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-var-lib-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-host\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-env-overrides\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-openvswitch\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-sys-fs\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.299644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-socket-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9a9c874-a655-4f8e-9492-86265496a4e7-sys\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-socket-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.298954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j26db\" (UniqueName: \"kubernetes.io/projected/6abb021c-2028-4afc-a02f-952af6060a13-kube-api-access-j26db\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-ovn\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-config\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-run-ovn\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299108 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6abb021c-2028-4afc-a02f-952af6060a13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.299502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovnkube-script-lib\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.300383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-tmp\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.300464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.300410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9a9c874-a655-4f8e-9492-86265496a4e7-etc-tuned\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.301514 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.301497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.308576 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.308555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt628\" (UniqueName: \"kubernetes.io/projected/b9a9c874-a655-4f8e-9492-86265496a4e7-kube-api-access-kt628\") pod \"tuned-sc7j5\" (UID: \"b9a9c874-a655-4f8e-9492-86265496a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.309508 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.309492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j26db\" (UniqueName: \"kubernetes.io/projected/6abb021c-2028-4afc-a02f-952af6060a13-kube-api-access-j26db\") pod \"aws-ebs-csi-driver-node-mtzws\" (UID: \"6abb021c-2028-4afc-a02f-952af6060a13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.311561 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.311542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wr2h\" (UniqueName: \"kubernetes.io/projected/1fe57666-24f8-4a83-ae5a-59f5b12c7a9e-kube-api-access-7wr2h\") pod \"ovnkube-node-ssm92\" (UID: \"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.390368 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.390333 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hp74n" Apr 28 19:16:41.399538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.399517 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:16:41.408092 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.408073 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" Apr 28 19:16:41.415668 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.415648 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jgrh8" Apr 28 19:16:41.423103 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.423083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bgmp8" Apr 28 19:16:41.430705 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.430671 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m4ddb" Apr 28 19:16:41.436952 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.436933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:16:41.446453 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.446438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" Apr 28 19:16:41.450990 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.450970 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" Apr 28 19:16:41.702031 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.701977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:41.702221 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.702151 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.702291 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.702231 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:42.702214482 +0000 UTC m=+4.052655868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.780066 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.779951 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe57666_24f8_4a83_ae5a_59f5b12c7a9e.slice/crio-776e72458360bbba4fdf187ef75ad21944279ba6fd7a15365bbe0d3d4a13140a WatchSource:0}: Error finding container 776e72458360bbba4fdf187ef75ad21944279ba6fd7a15365bbe0d3d4a13140a: Status 404 returned error can't find the container with id 776e72458360bbba4fdf187ef75ad21944279ba6fd7a15365bbe0d3d4a13140a Apr 28 19:16:41.784191 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.784163 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6544e7a1_69d4_41e0_b18d_961cdaa5418d.slice/crio-877b4809d85bec5b75d5d875e5b56d3e7b701b1165b962f777be690d36b29c0f WatchSource:0}: Error finding container 877b4809d85bec5b75d5d875e5b56d3e7b701b1165b962f777be690d36b29c0f: Status 404 returned error can't find the container with id 877b4809d85bec5b75d5d875e5b56d3e7b701b1165b962f777be690d36b29c0f Apr 28 19:16:41.785529 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.785508 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2352f752_8d71_483d_9d43_b79ba63f8cad.slice/crio-8106f96a982f91473c0de5574b3dbae62329fc7ed663d48b92071b8be1bd1662 WatchSource:0}: Error finding container 8106f96a982f91473c0de5574b3dbae62329fc7ed663d48b92071b8be1bd1662: Status 404 returned error can't find the container with id 8106f96a982f91473c0de5574b3dbae62329fc7ed663d48b92071b8be1bd1662 Apr 28 19:16:41.788748 
ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.788727 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abb021c_2028_4afc_a02f_952af6060a13.slice/crio-b2f8b545f0e1d3c6188923ac7e14784fc18786082a49307835b5e96c4ed85c32 WatchSource:0}: Error finding container b2f8b545f0e1d3c6188923ac7e14784fc18786082a49307835b5e96c4ed85c32: Status 404 returned error can't find the container with id b2f8b545f0e1d3c6188923ac7e14784fc18786082a49307835b5e96c4ed85c32 Apr 28 19:16:41.789769 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.789744 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692b128d_82a4_4c26_b17d_0b4d804ef295.slice/crio-98890e87226ec975403421353fa9c29ceffbcf6586c94b79c4628be892f1bb59 WatchSource:0}: Error finding container 98890e87226ec975403421353fa9c29ceffbcf6586c94b79c4628be892f1bb59: Status 404 returned error can't find the container with id 98890e87226ec975403421353fa9c29ceffbcf6586c94b79c4628be892f1bb59 Apr 28 19:16:41.790515 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.790493 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c3f56d_6859_4f8e_a645_45fb36262479.slice/crio-ba3d02ac5c0db8313c2660df41cb8ce2c5da640126e6fa14f09f598a6b09ee8e WatchSource:0}: Error finding container ba3d02ac5c0db8313c2660df41cb8ce2c5da640126e6fa14f09f598a6b09ee8e: Status 404 returned error can't find the container with id ba3d02ac5c0db8313c2660df41cb8ce2c5da640126e6fa14f09f598a6b09ee8e Apr 28 19:16:41.791698 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.791669 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a9c874_a655_4f8e_9492_86265496a4e7.slice/crio-d30ac9625af7ee883ce7ca07d0d25581e2dd4cad193125c58c3075daf58910ff WatchSource:0}: Error 
finding container d30ac9625af7ee883ce7ca07d0d25581e2dd4cad193125c58c3075daf58910ff: Status 404 returned error can't find the container with id d30ac9625af7ee883ce7ca07d0d25581e2dd4cad193125c58c3075daf58910ff Apr 28 19:16:41.792692 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.792657 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18eaae1_d122_4fa3_8b2e_ffc7868bfd03.slice/crio-4b9e31c317443b3981faa4776b8c52267cd6f553f139bfac8bd5522a9c36a2a8 WatchSource:0}: Error finding container 4b9e31c317443b3981faa4776b8c52267cd6f553f139bfac8bd5522a9c36a2a8: Status 404 returned error can't find the container with id 4b9e31c317443b3981faa4776b8c52267cd6f553f139bfac8bd5522a9c36a2a8 Apr 28 19:16:41.795040 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:16:41.795018 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c1a9d5_7a1d_4369_837a_3ed96d5f107f.slice/crio-67bdadaf175a177c23f19c83da014f26ab69f77d719b98026231f6254174868a WatchSource:0}: Error finding container 67bdadaf175a177c23f19c83da014f26ab69f77d719b98026231f6254174868a: Status 404 returned error can't find the container with id 67bdadaf175a177c23f19c83da014f26ab69f77d719b98026231f6254174868a Apr 28 19:16:41.803119 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:41.803091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:16:41.803258 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.803242 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 28 19:16:41.803300 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.803265 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:41.803300 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.803278 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:41.803366 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:41.803339 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:42.803318319 +0000 UTC m=+4.153759704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:42.134597 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.134558 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:40 +0000 UTC" deadline="2027-10-15 21:50:21.126890983 +0000 UTC"
Apr 28 19:16:42.134597 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.134594 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12842h33m38.99230121s"
Apr 28 19:16:42.191195 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.191160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:42.191359 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.191311 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:42.203450 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.203419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" event={"ID":"530700f112f77c890973fb51d737f28d","Type":"ContainerStarted","Data":"c109ffb819e9938aadb7648524e3a8bd8e196062c67a1c159937410c575f0b25"}
Apr 28 19:16:42.208749 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.208718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bgmp8" event={"ID":"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f","Type":"ContainerStarted","Data":"67bdadaf175a177c23f19c83da014f26ab69f77d719b98026231f6254174868a"}
Apr 28 19:16:42.214539 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.214416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p5qkw" event={"ID":"88c3f56d-6859-4f8e-a645-45fb36262479","Type":"ContainerStarted","Data":"ba3d02ac5c0db8313c2660df41cb8ce2c5da640126e6fa14f09f598a6b09ee8e"}
Apr 28 19:16:42.227806 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.227776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" event={"ID":"6abb021c-2028-4afc-a02f-952af6060a13","Type":"ContainerStarted","Data":"b2f8b545f0e1d3c6188923ac7e14784fc18786082a49307835b5e96c4ed85c32"}
Apr 28 19:16:42.231455 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.231363 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4ddb" event={"ID":"6544e7a1-69d4-41e0-b18d-961cdaa5418d","Type":"ContainerStarted","Data":"877b4809d85bec5b75d5d875e5b56d3e7b701b1165b962f777be690d36b29c0f"}
Apr 28 19:16:42.238071 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.236814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerStarted","Data":"4b9e31c317443b3981faa4776b8c52267cd6f553f139bfac8bd5522a9c36a2a8"}
Apr 28 19:16:42.241072 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.240060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" event={"ID":"b9a9c874-a655-4f8e-9492-86265496a4e7","Type":"ContainerStarted","Data":"d30ac9625af7ee883ce7ca07d0d25581e2dd4cad193125c58c3075daf58910ff"}
Apr 28 19:16:42.250345 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.250305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jgrh8" event={"ID":"2352f752-8d71-483d-9d43-b79ba63f8cad","Type":"ContainerStarted","Data":"8106f96a982f91473c0de5574b3dbae62329fc7ed663d48b92071b8be1bd1662"}
Apr 28 19:16:42.257352 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.254011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp74n" event={"ID":"692b128d-82a4-4c26-b17d-0b4d804ef295","Type":"ContainerStarted","Data":"98890e87226ec975403421353fa9c29ceffbcf6586c94b79c4628be892f1bb59"}
Apr 28 19:16:42.257352 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.256437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"776e72458360bbba4fdf187ef75ad21944279ba6fd7a15365bbe0d3d4a13140a"}
Apr 28 19:16:42.711067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.710978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:42.711227 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.711173 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:42.711282 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.711236 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:44.711216702 +0000 UTC m=+6.061658088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:42.811620 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:42.811582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:42.811826 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.811768 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:42.811826 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.811790 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:42.811826 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.811804 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:42.811980 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:42.811864 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:44.81184581 +0000 UTC m=+6.162287208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:43.191696 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:43.190999 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:43.191696 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:43.191134 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:43.265744 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:43.265213 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ce95a4f961ae9992c062db4f234920c" containerID="8094352f897418a0681c80e0ad752acfb442b7e913c85820f24249b838df12cd" exitCode=0
Apr 28 19:16:43.265744 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:43.265679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" event={"ID":"7ce95a4f961ae9992c062db4f234920c","Type":"ContainerDied","Data":"8094352f897418a0681c80e0ad752acfb442b7e913c85820f24249b838df12cd"}
Apr 28 19:16:43.282742 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:43.281567 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-22.ec2.internal" podStartSLOduration=3.281550324 podStartE2EDuration="3.281550324s" podCreationTimestamp="2026-04-28 19:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:42.229940872 +0000 UTC m=+3.580382275" watchObservedRunningTime="2026-04-28 19:16:43.281550324 +0000 UTC m=+4.631991731"
Apr 28 19:16:44.190931 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:44.190890 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:44.191182 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.191036 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:44.277406 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:44.277343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" event={"ID":"7ce95a4f961ae9992c062db4f234920c","Type":"ContainerStarted","Data":"c80ba7ba54aa678436594d1e705c685cffdaf19b475dcc8723642c4456a32307"}
Apr 28 19:16:44.731263 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:44.731226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:44.731457 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.731338 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:44.731457 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.731393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.731379615 +0000 UTC m=+10.081821000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:44.832607 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:44.832569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:44.832797 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.832751 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:44.832797 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.832769 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:44.832797 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.832782 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:44.832954 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:44.832842 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.832822238 +0000 UTC m=+10.183263636 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:45.191661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:45.191167 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:45.191661 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:45.191295 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:46.191457 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:46.190794 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:46.191457 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:46.190976 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:47.199728 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:47.199687 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:47.200179 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:47.199882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:48.191256 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:48.191218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:48.191433 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.191362 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:48.769554 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:48.769514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:48.770025 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.769708 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:48.770025 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.769776 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:56.769755543 +0000 UTC m=+18.120196926 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:48.871009 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:48.870361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:48.871009 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.870538 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:48.871009 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.870558 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:48.871009 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.870572 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:48.871009 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:48.870653 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:56.870614196 +0000 UTC m=+18.221055584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:49.194047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:49.194012 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:49.194237 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:49.194147 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:50.191221 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:50.191185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:50.191658 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:50.191325 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:51.191254 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:51.191163 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:51.191702 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:51.191297 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:52.191259 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:52.191217 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:52.191692 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:52.191359 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:53.191875 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:53.191834 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:53.192338 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:53.191978 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:54.191354 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.191324 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:54.191541 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.191445 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:54.252389 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.252344 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-22.ec2.internal" podStartSLOduration=14.252325853 podStartE2EDuration="14.252325853s" podCreationTimestamp="2026-04-28 19:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:44.293506582 +0000 UTC m=+5.643948014" watchObservedRunningTime="2026-04-28 19:16:54.252325853 +0000 UTC m=+15.602767304"
Apr 28 19:16:54.253003 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.252984 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7mtgn"]
Apr 28 19:16:54.255746 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.255713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.255850 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.255798 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f"
Apr 28 19:16:54.315894 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.315863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-kubelet-config\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.316047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.315905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-dbus\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.316047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.315944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.416782 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.416745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-kubelet-config\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.416987 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.416806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-dbus\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.416987 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.416836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.416987 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.416878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-kubelet-config\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.417145 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.417000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-dbus\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.417145 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.416999 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:54.417145 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.417071 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.917056136 +0000 UTC m=+16.267497520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:54.920282 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:54.920239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:54.920469 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.920413 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:54.920525 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:54.920476 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:55.920459572 +0000 UTC m=+17.270900961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:55.191800 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:55.191713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:55.191966 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:55.191848 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654"
Apr 28 19:16:55.928174 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:55.927547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:55.928174 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:55.927752 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:55.928174 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:55.927817 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:57.927798286 +0000 UTC m=+19.278239672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:56.191212 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:56.191131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:16:56.191364 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:56.191131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:56.191364 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.191257 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f"
Apr 28 19:16:56.191364 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.191348 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6"
Apr 28 19:16:56.833472 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:56.833427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:16:56.833672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.833572 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:56.833672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.833663 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:12.833625677 +0000 UTC m=+34.184067077 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:56.934534 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:56.934496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:16:56.935012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.934706 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:56.935012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.934729 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:56.935012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.934739 2578 projected.go:194] Error preparing data for projected volume kube-api-access-gs2xn for pod openshift-network-diagnostics/network-check-target-k9zr5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:56.935012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:56.934793 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn podName:27090a69-2cdb-4eae-a82d-5fa7351f8654 nodeName:}" failed.
No retries permitted until 2026-04-28 19:17:12.934779688 +0000 UTC m=+34.285221071 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs2xn" (UniqueName: "kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn") pod "network-check-target-k9zr5" (UID: "27090a69-2cdb-4eae-a82d-5fa7351f8654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:57.190917 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:57.190886 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:16:57.191202 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:57.190988 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:16:57.943043 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:57.942997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:16:57.943483 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:57.943168 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:57.943483 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:57.943246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.943225563 +0000 UTC m=+23.293666950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:58.191502 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:58.191472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:16:58.191502 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:58.191512 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:16:58.191708 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:58.191603 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:16:58.191765 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:58.191737 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:16:59.195015 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.194804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:16:59.195853 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:16:59.195101 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:16:59.302172 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.302146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" event={"ID":"6abb021c-2028-4afc-a02f-952af6060a13","Type":"ContainerStarted","Data":"c72323de6e64cc221ac488671c8c3cb40fbaae6987b31b28ba041e5e8776137a"} Apr 28 19:16:59.303456 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.303433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4ddb" event={"ID":"6544e7a1-69d4-41e0-b18d-961cdaa5418d","Type":"ContainerStarted","Data":"353cc549769cc70f3cd48687d28e5b29c012f034e2cc3220d17b2a81bb962349"} Apr 28 19:16:59.304576 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.304551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerStarted","Data":"a0e534dbd1b75a9a0e57ced8151edc35c5df1d081683008de84aa666ba61d8b4"} Apr 28 19:16:59.305810 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.305787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" event={"ID":"b9a9c874-a655-4f8e-9492-86265496a4e7","Type":"ContainerStarted","Data":"7383d9e25fa2967b2aab4f572e9a58a2b6a0c2a65821c82b9b7f6d94e94cb2c5"} Apr 28 19:16:59.306945 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.306924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp74n" event={"ID":"692b128d-82a4-4c26-b17d-0b4d804ef295","Type":"ContainerStarted","Data":"015b653fb67a9d2abd7768e3f32f8f954bb9728a16a06984ac9249972486bbf0"} Apr 28 19:16:59.308074 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.308052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" 
event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"57da24d925a30da155a187eb0ce456c4367d7660690f1c2e81ee2ec8f090f2eb"} Apr 28 19:16:59.309568 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.309546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bgmp8" event={"ID":"d5c1a9d5-7a1d-4369-837a-3ed96d5f107f","Type":"ContainerStarted","Data":"dc12af6e2bef883c61715e4d22a0c0700967ecdc6ca43377f9134382c20925e7"} Apr 28 19:16:59.311254 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.311233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p5qkw" event={"ID":"88c3f56d-6859-4f8e-a645-45fb36262479","Type":"ContainerStarted","Data":"7a18e9ef69c92675fc5d84a7eaff0fb7e6a120172bad727243998728b3b5e3cc"} Apr 28 19:16:59.322539 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.322502 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m4ddb" podStartSLOduration=3.203641625 podStartE2EDuration="20.322493369s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.78644906 +0000 UTC m=+3.136890447" lastFinishedPulling="2026-04-28 19:16:58.905300805 +0000 UTC m=+20.255742191" observedRunningTime="2026-04-28 19:16:59.322199333 +0000 UTC m=+20.672640738" watchObservedRunningTime="2026-04-28 19:16:59.322493369 +0000 UTC m=+20.672934775" Apr 28 19:16:59.338975 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.338936 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p5qkw" podStartSLOduration=3.262061692 podStartE2EDuration="20.338922697s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.793262702 +0000 UTC m=+3.143704095" lastFinishedPulling="2026-04-28 19:16:58.870123716 +0000 UTC m=+20.220565100" observedRunningTime="2026-04-28 19:16:59.338600749 +0000 UTC m=+20.689042152" 
watchObservedRunningTime="2026-04-28 19:16:59.338922697 +0000 UTC m=+20.689364103" Apr 28 19:16:59.368820 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.368777 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hp74n" podStartSLOduration=3.290665314 podStartE2EDuration="20.368762411s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.791957707 +0000 UTC m=+3.142399109" lastFinishedPulling="2026-04-28 19:16:58.870054818 +0000 UTC m=+20.220496206" observedRunningTime="2026-04-28 19:16:59.368471309 +0000 UTC m=+20.718912715" watchObservedRunningTime="2026-04-28 19:16:59.368762411 +0000 UTC m=+20.719203846" Apr 28 19:16:59.463110 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:16:59.463070 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bgmp8" podStartSLOduration=3.365538256 podStartE2EDuration="20.463056492s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.796814081 +0000 UTC m=+3.147255464" lastFinishedPulling="2026-04-28 19:16:58.894332317 +0000 UTC m=+20.244773700" observedRunningTime="2026-04-28 19:16:59.427235287 +0000 UTC m=+20.777676705" watchObservedRunningTime="2026-04-28 19:16:59.463056492 +0000 UTC m=+20.813497897" Apr 28 19:17:00.191558 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.191488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:00.191943 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.191488 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:00.191943 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:00.191591 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:00.191943 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:00.191685 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:00.314675 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.314625 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="a0e534dbd1b75a9a0e57ced8151edc35c5df1d081683008de84aa666ba61d8b4" exitCode=0 Apr 28 19:17:00.314675 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.314667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"a0e534dbd1b75a9a0e57ced8151edc35c5df1d081683008de84aa666ba61d8b4"} Apr 28 19:17:00.316913 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.316897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:17:00.317184 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317166 2578 generic.go:358] 
"Generic (PLEG): container finished" podID="1fe57666-24f8-4a83-ae5a-59f5b12c7a9e" containerID="d5cf3f968cbb25788fd47b3ceb73b2200f3289ed448058de1835b3f343a3b488" exitCode=1 Apr 28 19:17:00.317287 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"dfcf78d038a76a13eacdef85d0eaab7174d7dcff2d1bbee15402495d510b1336"} Apr 28 19:17:00.317344 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"262a3b616164988b883dc7c8fb6dc349fe21427180838b4da2345752ca8532fb"} Apr 28 19:17:00.317344 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317317 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"679f69603af2cf4c2c7900a73a5ec720ff0a8e599f384d21b35b82bd4203cb29"} Apr 28 19:17:00.317344 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"aadfa02a66205a12b07373db9146d829ebf49b9af29f03638390fa10cf7912eb"} Apr 28 19:17:00.317448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.317343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerDied","Data":"d5cf3f968cbb25788fd47b3ceb73b2200f3289ed448058de1835b3f343a3b488"} Apr 28 19:17:00.332045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.332002 2578 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sc7j5" podStartSLOduration=4.255494232 podStartE2EDuration="21.331989636s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.793540515 +0000 UTC m=+3.143981913" lastFinishedPulling="2026-04-28 19:16:58.870035931 +0000 UTC m=+20.220477317" observedRunningTime="2026-04-28 19:16:59.463621289 +0000 UTC m=+20.814062694" watchObservedRunningTime="2026-04-28 19:17:00.331989636 +0000 UTC m=+21.682431041" Apr 28 19:17:00.378258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:00.378231 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:17:01.155575 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.155456 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:17:00.378252197Z","UUID":"dc389f70-e742-4379-941a-161ee5127c24","Handler":null,"Name":"","Endpoint":""} Apr 28 19:17:01.159281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.159244 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:17:01.159281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.159277 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:17:01.191917 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.191885 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:01.192069 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:01.192004 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:01.321099 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.321057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jgrh8" event={"ID":"2352f752-8d71-483d-9d43-b79ba63f8cad","Type":"ContainerStarted","Data":"33ec338b60c072bd06046448efaa0f35a109e508c8ff83f1daa1025b5caa72ab"} Apr 28 19:17:01.322923 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.322895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" event={"ID":"6abb021c-2028-4afc-a02f-952af6060a13","Type":"ContainerStarted","Data":"f09f0ec86de0e98217950c3c333e154e271fc71d75acbf40e48774ede6666795"} Apr 28 19:17:01.340680 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:01.340622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jgrh8" podStartSLOduration=5.257882842 podStartE2EDuration="22.340605703s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.7873122 +0000 UTC m=+3.137753588" lastFinishedPulling="2026-04-28 19:16:58.870035065 +0000 UTC m=+20.220476449" observedRunningTime="2026-04-28 19:17:01.339991747 +0000 UTC m=+22.690433152" watchObservedRunningTime="2026-04-28 19:17:01.340605703 +0000 UTC m=+22.691047108" Apr 28 19:17:01.974030 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:17:01.973998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:01.974220 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:01.974150 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:17:01.974315 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:01.974223 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:09.974201109 +0000 UTC m=+31.324642505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:17:02.191363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.191336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:02.191494 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.191336 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:02.191552 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:02.191502 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:02.191684 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:02.191655 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:02.327610 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.327082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" event={"ID":"6abb021c-2028-4afc-a02f-952af6060a13","Type":"ContainerStarted","Data":"553a2578994e1c9099b7d455c91585657058ba230b22ed01b76b822a9408aad0"} Apr 28 19:17:02.330174 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.330147 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:17:02.330762 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.330731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"50be981e9796f51c9708868ef1253aa1947c607f5480ad03dadd3c70b1b9411c"} Apr 28 19:17:02.348045 
ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:02.347997 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mtzws" podStartSLOduration=3.632198014 podStartE2EDuration="23.34798144s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.790296099 +0000 UTC m=+3.140737482" lastFinishedPulling="2026-04-28 19:17:01.506079511 +0000 UTC m=+22.856520908" observedRunningTime="2026-04-28 19:17:02.347509012 +0000 UTC m=+23.697950418" watchObservedRunningTime="2026-04-28 19:17:02.34798144 +0000 UTC m=+23.698422846" Apr 28 19:17:03.191036 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:03.190846 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:03.191273 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:03.191136 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:03.375754 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:03.375610 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:17:03.376416 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:03.376397 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:17:04.191420 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:04.191383 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:04.191420 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:04.191407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:04.191672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:04.191523 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:04.191741 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:04.191680 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:04.334187 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:04.334155 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:17:04.334761 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:04.334735 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p5qkw" Apr 28 19:17:05.191338 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.191154 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:05.191970 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:05.191407 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:05.337189 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.337156 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="ff01edc589494d8a0e28b3836db66d1141efb6e11c3bc80c803b700687ad0948" exitCode=0 Apr 28 19:17:05.337367 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.337230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"ff01edc589494d8a0e28b3836db66d1141efb6e11c3bc80c803b700687ad0948"} Apr 28 19:17:05.340233 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.340210 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:17:05.340530 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.340504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"0e13a0d0e51fa509444208e6afe59c0aeef0f717e6194d4c9041ba08911c9d94"} Apr 28 19:17:05.340857 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.340821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 
19:17:05.340963 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.340861 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:17:05.341008 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.340991 2578 scope.go:117] "RemoveContainer" containerID="d5cf3f968cbb25788fd47b3ceb73b2200f3289ed448058de1835b3f343a3b488" Apr 28 19:17:05.355767 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:05.355748 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:17:06.190982 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.190951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:06.191204 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.190951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:06.191204 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:06.191061 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:06.191204 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:06.191163 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:06.345878 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.345805 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:17:06.346427 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.346398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" event={"ID":"1fe57666-24f8-4a83-ae5a-59f5b12c7a9e","Type":"ContainerStarted","Data":"1ee1366644f2e0f7d7c703a0b8719d317976daff601628d1f5889fb7fe659dd3"} Apr 28 19:17:06.347113 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.347086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:17:06.365321 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.365156 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:17:06.381363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.381310 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" podStartSLOduration=9.999583858 podStartE2EDuration="27.381291399s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.783764831 +0000 UTC m=+3.134206215" lastFinishedPulling="2026-04-28 19:16:59.165472354 +0000 UTC m=+20.515913756" observedRunningTime="2026-04-28 19:17:06.379828593 +0000 UTC m=+27.730270001" watchObservedRunningTime="2026-04-28 19:17:06.381291399 +0000 UTC m=+27.731732809" Apr 28 19:17:06.384368 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.384339 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7mtgn"] Apr 28 19:17:06.384460 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:17:06.384431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:06.384542 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:06.384520 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:06.388937 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.388910 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ssxm"] Apr 28 19:17:06.389662 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.389148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:06.389662 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:06.389291 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:06.391722 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.391699 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9zr5"] Apr 28 19:17:06.391829 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:06.391816 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:06.391941 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:06.391890 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:07.350046 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:07.350002 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="7a6564c0f820df45ae6ce8a0e38411f7048ea3dca422e6d8fca090ccb0964148" exitCode=0 Apr 28 19:17:07.350534 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:07.350082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"7a6564c0f820df45ae6ce8a0e38411f7048ea3dca422e6d8fca090ccb0964148"} Apr 28 19:17:08.191266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:08.191237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:08.191429 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:08.191237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:08.191429 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:08.191333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:08.191429 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:08.191400 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:08.191429 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:08.191236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:08.191561 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:08.191466 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:09.356091 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:09.355912 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="61f04d8d322548758ae98a49c1a656de9cc14cd9046796aeee46d8427620d443" exitCode=0 Apr 28 19:17:09.356440 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:09.355990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"61f04d8d322548758ae98a49c1a656de9cc14cd9046796aeee46d8427620d443"} Apr 28 19:17:10.033588 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:10.033559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:10.033838 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:10.033710 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:17:10.033838 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:10.033768 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret podName:3eaec6bb-3277-478e-9ecc-a557fa5a5b7f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:26.03375035 +0000 UTC m=+47.384191734 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret") pod "global-pull-secret-syncer-7mtgn" (UID: "3eaec6bb-3277-478e-9ecc-a557fa5a5b7f") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:17:10.191747 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:10.191713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:10.191931 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:10.191844 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:10.191931 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:10.191850 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7mtgn" podUID="3eaec6bb-3277-478e-9ecc-a557fa5a5b7f" Apr 28 19:17:10.191931 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:10.191881 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:10.192094 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:10.191952 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9zr5" podUID="27090a69-2cdb-4eae-a82d-5fa7351f8654" Apr 28 19:17:10.192094 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:10.192026 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:17:11.944366 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:11.944296 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-22.ec2.internal" event="NodeReady" Apr 28 19:17:11.944975 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:11.944449 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 28 19:17:11.986322 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:11.986291 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"] Apr 28 19:17:12.006439 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.006064 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bb764ccc7-ppc7d"] Apr 28 19:17:12.021754 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.021724 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"] Apr 28 19:17:12.022251 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.022221 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.027373 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.027353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-m25ps\"" Apr 28 19:17:12.027497 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.027409 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 28 19:17:12.027497 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.027423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 28 19:17:12.027701 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.027686 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 28 19:17:12.034296 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.034273 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"] Apr 28 19:17:12.034406 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.034389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:12.034570 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.034548 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" Apr 28 19:17:12.035210 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.035190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 28 19:17:12.037911 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.037744 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 28 19:17:12.037911 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.037767 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 28 19:17:12.037911 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.037790 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bssxn\"" Apr 28 19:17:12.037911 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.037751 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 28 19:17:12.044866 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.044843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 28 19:17:12.045077 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.045056 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 28 19:17:12.045856 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.045835 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 28 19:17:12.045960 ip-10-0-143-22 kubenswrapper[2578]: 
I0428 19:17:12.045872 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-z9gfq\"" Apr 28 19:17:12.053107 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.053088 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"] Apr 28 19:17:12.053259 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.053241 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" Apr 28 19:17:12.056538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.056520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 28 19:17:12.056538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.056533 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 28 19:17:12.057061 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.057045 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 28 19:17:12.057193 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.057046 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 28 19:17:12.071665 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071618 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"] Apr 28 19:17:12.071665 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071665 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"] Apr 28 19:17:12.071790 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071678 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"] Apr 28 19:17:12.071790 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071690 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"] Apr 28 19:17:12.071790 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071704 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wlbdc"] Apr 28 19:17:12.071790 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.071762 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:12.074968 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.074947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 28 19:17:12.090529 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.090509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bb764ccc7-ppc7d"] Apr 28 19:17:12.090529 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.090530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wlbdc"] Apr 28 19:17:12.090704 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.090626 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:12.093541 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.093519 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 28 19:17:12.093652 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.093603 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-524nx\"" Apr 28 19:17:12.093864 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.093847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 28 19:17:12.095327 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.095311 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 28 19:17:12.108839 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.108817 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c55mw"] Apr 28 19:17:12.130517 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.130481 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c55mw"] Apr 28 19:17:12.130665 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.130606 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:12.133220 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.133198 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:17:12.134372 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.134354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:17:12.134480 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.134357 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jtx2j\"" Apr 28 19:17:12.147807 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:12.147891 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" Apr 28 19:17:12.147931 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: 
\"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.147986 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a00e9848-ea32-4408-8214-1a5a27b0ffb7-tmp\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:12.147986 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz579\" (UniqueName: \"kubernetes.io/projected/a00e9848-ea32-4408-8214-1a5a27b0ffb7-kube-api-access-gz579\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:12.148075 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.147996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.148075 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" 
Apr 28 19:17:12.148075 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:12.148218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.148218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.148218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pmwf\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.148218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a00e9848-ea32-4408-8214-1a5a27b0ffb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8j5\" (UniqueName: 
\"kubernetes.io/projected/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-kube-api-access-bv8j5\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.148359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.148561 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zgdr\" (UniqueName: \"kubernetes.io/projected/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-kube-api-access-7zgdr\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.148561 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.148561 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.148424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.191295 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.191266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn"
Apr 28 19:17:12.191460 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.191265 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:17:12.191521 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.191275 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5"
Apr 28 19:17:12.194243 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:17:12.194426 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194384 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cnh2p\""
Apr 28 19:17:12.194426 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194400 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:17:12.194540 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194493 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:17:12.194599 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194588 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9tnjq\""
Apr 28 19:17:12.194706 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.194694 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:17:12.249858 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.249818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250010 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.249892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250010 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.249932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8j5\" (UniqueName: \"kubernetes.io/projected/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-kube-api-access-bv8j5\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.250010 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.249955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.250010 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.249987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a00e9848-ea32-4408-8214-1a5a27b0ffb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zgdr\" (UniqueName: \"kubernetes.io/projected/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-kube-api-access-7zgdr\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8327b8b7-48d4-4d18-bec4-8cea6c826302-tmp-dir\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a00e9848-ea32-4408-8214-1a5a27b0ffb7-tmp\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz579\" (UniqueName: \"kubernetes.io/projected/a00e9848-ea32-4408-8214-1a5a27b0ffb7-kube-api-access-gz579\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.250222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.250709 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"
Apr 28 19:17:12.250709 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250709 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250365 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.250709 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.250648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a00e9848-ea32-4408-8214-1a5a27b0ffb7-tmp\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.251374 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.251374 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pmwf\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.251374 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.251374 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxxx\" (UniqueName: \"kubernetes.io/projected/b5ab40ee-0c46-43db-8a80-02e47728a72f-kube-api-access-sfxxx\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8327b8b7-48d4-4d18-bec4-8cea6c826302-config-volume\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptddq\" (UniqueName: \"kubernetes.io/projected/8327b8b7-48d4-4d18-bec4-8cea6c826302-kube-api-access-ptddq\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.251619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.251603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.252012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.251965 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:17:12.252012 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.251982 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found
Apr 28 19:17:12.252106 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.252036 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:12.752017971 +0000 UTC m=+34.102459354 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found
Apr 28 19:17:12.252339 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.252316 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:17:12.252415 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.252375 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:12.752358761 +0000 UTC m=+34.102800146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found
Apr 28 19:17:12.252650 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.252612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.253266 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.253239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.253701 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.253677 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"
Apr 28 19:17:12.256293 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.255541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.256293 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.255783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.256293 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.255865 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-ca\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.256293 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.255934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.256293 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.256061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-hub\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.256724 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.256680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.256970 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.256952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.257303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.257263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a00e9848-ea32-4408-8214-1a5a27b0ffb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.265203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.265153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz579\" (UniqueName: \"kubernetes.io/projected/a00e9848-ea32-4408-8214-1a5a27b0ffb7-kube-api-access-gz579\") pod \"klusterlet-addon-workmgr-857df48f6f-7m5wr\" (UID: \"a00e9848-ea32-4408-8214-1a5a27b0ffb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.265857 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.265832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.266823 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.266368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pmwf\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.269784 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.269764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8j5\" (UniqueName: \"kubernetes.io/projected/e649bde4-cf50-48a9-ad53-c5bbf78f92c5-kube-api-access-bv8j5\") pod \"managed-serviceaccount-addon-agent-77ff884f47-h59gf\" (UID: \"e649bde4-cf50-48a9-ad53-c5bbf78f92c5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.269866 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.269803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zgdr\" (UniqueName: \"kubernetes.io/projected/67ae9b4c-6c63-4813-b08e-8ec2f3197cdf-kube-api-access-7zgdr\") pod \"cluster-proxy-proxy-agent-86d4c6b76c-ncnd5\" (UID: \"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.352271 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxxx\" (UniqueName: \"kubernetes.io/projected/b5ab40ee-0c46-43db-8a80-02e47728a72f-kube-api-access-sfxxx\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:17:12.352453 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8327b8b7-48d4-4d18-bec4-8cea6c826302-config-volume\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.352453 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptddq\" (UniqueName: \"kubernetes.io/projected/8327b8b7-48d4-4d18-bec4-8cea6c826302-kube-api-access-ptddq\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.352453 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:17:12.352453 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8327b8b7-48d4-4d18-bec4-8cea6c826302-tmp-dir\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.352678 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.352529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.352678 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.352662 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:12.352783 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.352671 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:12.352783 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.352731 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:12.852711335 +0000 UTC m=+34.203152721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:12.352783 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.352749 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:12.852739708 +0000 UTC m=+34.203181092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found
Apr 28 19:17:12.353138 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.353116 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8327b8b7-48d4-4d18-bec4-8cea6c826302-tmp-dir\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.353259 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.353238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8327b8b7-48d4-4d18-bec4-8cea6c826302-config-volume\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.359219 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.359199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"
Apr 28 19:17:12.367203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.367175 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"
Apr 28 19:17:12.369377 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.369356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptddq\" (UniqueName: \"kubernetes.io/projected/8327b8b7-48d4-4d18-bec4-8cea6c826302-kube-api-access-ptddq\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:17:12.369844 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.369774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxxx\" (UniqueName: \"kubernetes.io/projected/b5ab40ee-0c46-43db-8a80-02e47728a72f-kube-api-access-sfxxx\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:17:12.396908 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.396493 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:17:12.554271 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.554018 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf"]
Apr 28 19:17:12.560965 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:17:12.560940 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode649bde4_cf50_48a9_ad53_c5bbf78f92c5.slice/crio-778efea126e2656f8f1678b8c74fa067c65635482b9fe256dbcd745d24e67c8d WatchSource:0}: Error finding container 778efea126e2656f8f1678b8c74fa067c65635482b9fe256dbcd745d24e67c8d: Status 404 returned error can't find the container with id 778efea126e2656f8f1678b8c74fa067c65635482b9fe256dbcd745d24e67c8d
Apr 28 19:17:12.566097 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.565890 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"]
Apr 28 19:17:12.570298 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:17:12.570269 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00e9848_ea32_4408_8214_1a5a27b0ffb7.slice/crio-c2ed8dec860637f9c0358556fde930f57946075382e1f38f5c675667bbb74cf4 WatchSource:0}: Error finding container c2ed8dec860637f9c0358556fde930f57946075382e1f38f5c675667bbb74cf4: Status 404 returned error can't find the container with id c2ed8dec860637f9c0358556fde930f57946075382e1f38f5c675667bbb74cf4
Apr 28 19:17:12.577377 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.577354 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5"]
Apr 28 19:17:12.580414 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:17:12.580390 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ae9b4c_6c63_4813_b08e_8ec2f3197cdf.slice/crio-94268b488b0bf296fe54ba8bb96767aeac8fcc6d167f712731996ae4ca991254 WatchSource:0}: Error finding container 94268b488b0bf296fe54ba8bb96767aeac8fcc6d167f712731996ae4ca991254: Status 404 returned error can't find the container with id 94268b488b0bf296fe54ba8bb96767aeac8fcc6d167f712731996ae4ca991254
Apr 28 19:17:12.758217 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.758125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:17:12.758217 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.758189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"
Apr 28 19:17:12.758458 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.758283 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:17:12.758458 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.758298 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found
Apr 28 19:17:12.758458 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.758349 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.758334745 +0000 UTC m=+35.108776129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found
Apr 28 19:17:12.758458 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.758362 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:17:12.758458 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.758428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.758407743 +0000 UTC m=+35.108849131 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:12.859566 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.859527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:12.859782 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.859595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:12.859782 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.859627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:12.859782 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859722 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:12.859782 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859739 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:12.859782 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859738 2578 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:17:12.860023 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859791 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.859775065 +0000 UTC m=+35.210216449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:12.860023 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859807 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:44.859800885 +0000 UTC m=+66.210242268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : secret "metrics-daemon-secret" not found Apr 28 19:17:12.860023 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:12.859821 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.859812872 +0000 UTC m=+35.210254258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:12.960359 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.960322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:12.964354 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:12.964324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2xn\" (UniqueName: \"kubernetes.io/projected/27090a69-2cdb-4eae-a82d-5fa7351f8654-kube-api-access-gs2xn\") pod \"network-check-target-k9zr5\" (UID: \"27090a69-2cdb-4eae-a82d-5fa7351f8654\") " pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:13.115972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.115940 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:13.365473 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.365432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" event={"ID":"e649bde4-cf50-48a9-ad53-c5bbf78f92c5","Type":"ContainerStarted","Data":"778efea126e2656f8f1678b8c74fa067c65635482b9fe256dbcd745d24e67c8d"} Apr 28 19:17:13.366763 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.366685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" event={"ID":"a00e9848-ea32-4408-8214-1a5a27b0ffb7","Type":"ContainerStarted","Data":"c2ed8dec860637f9c0358556fde930f57946075382e1f38f5c675667bbb74cf4"} Apr 28 19:17:13.367795 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.367773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerStarted","Data":"94268b488b0bf296fe54ba8bb96767aeac8fcc6d167f712731996ae4ca991254"} Apr 28 19:17:13.766945 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.766864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.767009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod 
\"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.767044 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.767099 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.767109 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.767128 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:15.767107382 +0000 UTC m=+37.117548779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:13.767173 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.767146 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:15.7671356 +0000 UTC m=+37.117576985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:17:13.867619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.867578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:13.867840 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:13.867686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:13.867840 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.867740 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:13.867840 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.867802 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:13.867840 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.867814 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:15.867794558 +0000 UTC m=+37.218235949 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:13.868042 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:13.867849 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:15.867834483 +0000 UTC m=+37.218275866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:15.195791 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.195759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9zr5"] Apr 28 19:17:15.232887 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:17:15.232855 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27090a69_2cdb_4eae_a82d_5fa7351f8654.slice/crio-2b0553053b685eb6fd45bf9f25ac9d646c3fe49cd85ae4b57e17a8a643b20698 WatchSource:0}: Error finding container 2b0553053b685eb6fd45bf9f25ac9d646c3fe49cd85ae4b57e17a8a643b20698: Status 404 returned error can't find the container with id 2b0553053b685eb6fd45bf9f25ac9d646c3fe49cd85ae4b57e17a8a643b20698 Apr 28 19:17:15.373610 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.373575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k9zr5" event={"ID":"27090a69-2cdb-4eae-a82d-5fa7351f8654","Type":"ContainerStarted","Data":"2b0553053b685eb6fd45bf9f25ac9d646c3fe49cd85ae4b57e17a8a643b20698"} 
Apr 28 19:17:15.786111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.786063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:15.786291 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.786140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:15.786291 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.786281 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:15.786407 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.786346 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.786326586 +0000 UTC m=+41.136767970 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:15.786470 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.786420 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:15.786470 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.786432 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:17:15.786470 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.786466 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.786454754 +0000 UTC m=+41.136896141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:17:15.887470 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.887381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:15.887675 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:15.887511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:15.887748 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.887724 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:15.887921 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.887894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.887874078 +0000 UTC m=+41.238315464 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:15.888399 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.888364 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:15.888498 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:15.888423 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.888406038 +0000 UTC m=+41.238847423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:16.381895 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:16.380992 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="1454b5a30f61ba15918b6af2f46f332b0b0db2b298853c34df4236a84bdbe0e7" exitCode=0 Apr 28 19:17:16.381895 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:16.381078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"1454b5a30f61ba15918b6af2f46f332b0b0db2b298853c34df4236a84bdbe0e7"} Apr 28 19:17:17.389018 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:17.388132 2578 generic.go:358] "Generic (PLEG): container finished" podID="d18eaae1-d122-4fa3-8b2e-ffc7868bfd03" containerID="3d1c5838124c5d49609267a3b1eea71f9bba66347be7a4d718be06aa456a7d3a" exitCode=0 Apr 28 
19:17:17.389018 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:17.388413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerDied","Data":"3d1c5838124c5d49609267a3b1eea71f9bba66347be7a4d718be06aa456a7d3a"} Apr 28 19:17:19.830602 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:19.830562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:19.830623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.830751 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.830774 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.830832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:17:27.830810212 +0000 UTC m=+49.181251613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.830755 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:19.831067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.830894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:27.830882393 +0000 UTC m=+49.181323778 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:19.931978 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:19.931942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:19.932184 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:19.932043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:19.932184 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.932100 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:19.932184 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.932132 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:19.932184 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.932178 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:27.932151664 +0000 UTC m=+49.282593068 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:19.932404 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:19.932192 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:27.932186505 +0000 UTC m=+49.282627889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:22.401491 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.401451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" event={"ID":"d18eaae1-d122-4fa3-8b2e-ffc7868bfd03","Type":"ContainerStarted","Data":"a6e30c64c925d1a06fcdb01904ecc3b81b6184741264cbe9ddf847d6c5886917"} Apr 28 19:17:22.402746 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.402704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" event={"ID":"a00e9848-ea32-4408-8214-1a5a27b0ffb7","Type":"ContainerStarted","Data":"e064e72558da68018ea368c3fef48903ae3a7d336ad8fa97610ef815465c3ac4"} Apr 28 19:17:22.402881 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.402852 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:22.403977 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.403955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-k9zr5" event={"ID":"27090a69-2cdb-4eae-a82d-5fa7351f8654","Type":"ContainerStarted","Data":"c14940b5b09f812b82b896d577ab749bddfe1f1e5df09fa3d6fcb0b331669f2d"} Apr 28 19:17:22.404120 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.404094 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:17:22.404733 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.404714 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" Apr 28 19:17:22.405318 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.405300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" event={"ID":"e649bde4-cf50-48a9-ad53-c5bbf78f92c5","Type":"ContainerStarted","Data":"4c46335da42639d40ba31c70141f5dccacb74347900465f6030e2598a38ce9c2"} Apr 28 19:17:22.406488 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.406469 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerStarted","Data":"1d95f5c98fb21fdcabe39abf128b65c6ecff9d24a85379f6ba6feb4921c3f5f1"} Apr 28 19:17:22.434816 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.434774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v4wsc" podStartSLOduration=9.966649904 podStartE2EDuration="43.434762781s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:16:41.794391955 +0000 UTC m=+3.144833341" lastFinishedPulling="2026-04-28 19:17:15.262504635 +0000 UTC m=+36.612946218" observedRunningTime="2026-04-28 19:17:22.434316213 +0000 UTC m=+43.784757616" 
watchObservedRunningTime="2026-04-28 19:17:22.434762781 +0000 UTC m=+43.785204186" Apr 28 19:17:22.454865 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.454826 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" podStartSLOduration=6.118268993 podStartE2EDuration="15.454813545s" podCreationTimestamp="2026-04-28 19:17:07 +0000 UTC" firstStartedPulling="2026-04-28 19:17:12.56328764 +0000 UTC m=+33.913729030" lastFinishedPulling="2026-04-28 19:17:21.899832192 +0000 UTC m=+43.250273582" observedRunningTime="2026-04-28 19:17:22.454266268 +0000 UTC m=+43.804707674" watchObservedRunningTime="2026-04-28 19:17:22.454813545 +0000 UTC m=+43.805254987" Apr 28 19:17:22.477345 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.477294 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k9zr5" podStartSLOduration=36.805435501 podStartE2EDuration="43.477278857s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:17:15.237336173 +0000 UTC m=+36.587777556" lastFinishedPulling="2026-04-28 19:17:21.909179527 +0000 UTC m=+43.259620912" observedRunningTime="2026-04-28 19:17:22.47720765 +0000 UTC m=+43.827649056" watchObservedRunningTime="2026-04-28 19:17:22.477278857 +0000 UTC m=+43.827720263" Apr 28 19:17:22.495553 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:22.495507 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" podStartSLOduration=6.167477418 podStartE2EDuration="15.495491917s" podCreationTimestamp="2026-04-28 19:17:07 +0000 UTC" firstStartedPulling="2026-04-28 19:17:12.57205742 +0000 UTC m=+33.922498808" lastFinishedPulling="2026-04-28 19:17:21.900071919 +0000 UTC m=+43.250513307" observedRunningTime="2026-04-28 19:17:22.494694921 +0000 UTC 
m=+43.845136327" watchObservedRunningTime="2026-04-28 19:17:22.495491917 +0000 UTC m=+43.845933327" Apr 28 19:17:24.415267 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:24.415226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerStarted","Data":"3629ee5738887da115828118dfe3688f78d98e7332020f936114e1bbd7016bfe"} Apr 28 19:17:24.415267 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:24.415267 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerStarted","Data":"0b27098da583f3b9fc70ecf1a928838c93d1fd64f04705e3a965669973bad389"} Apr 28 19:17:24.438225 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:24.438175 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" podStartSLOduration=6.02461566 podStartE2EDuration="17.438157762s" podCreationTimestamp="2026-04-28 19:17:07 +0000 UTC" firstStartedPulling="2026-04-28 19:17:12.582386978 +0000 UTC m=+33.932828365" lastFinishedPulling="2026-04-28 19:17:23.995929081 +0000 UTC m=+45.346370467" observedRunningTime="2026-04-28 19:17:24.437177512 +0000 UTC m=+45.787618916" watchObservedRunningTime="2026-04-28 19:17:24.438157762 +0000 UTC m=+45.788599166" Apr 28 19:17:26.091488 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:26.091401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:26.094957 ip-10-0-143-22 
kubenswrapper[2578]: I0428 19:17:26.094933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3eaec6bb-3277-478e-9ecc-a557fa5a5b7f-original-pull-secret\") pod \"global-pull-secret-syncer-7mtgn\" (UID: \"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f\") " pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:26.302903 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:26.302857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7mtgn" Apr 28 19:17:26.419280 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:26.419249 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7mtgn"] Apr 28 19:17:26.422800 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:17:26.422776 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eaec6bb_3277_478e_9ecc_a557fa5a5b7f.slice/crio-127ca736b008f21490adffa2be973ee0dac921849d51c7f9c07dc8ef1a198eba WatchSource:0}: Error finding container 127ca736b008f21490adffa2be973ee0dac921849d51c7f9c07dc8ef1a198eba: Status 404 returned error can't find the container with id 127ca736b008f21490adffa2be973ee0dac921849d51c7f9c07dc8ef1a198eba Apr 28 19:17:27.428279 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:27.428236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7mtgn" event={"ID":"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f","Type":"ContainerStarted","Data":"127ca736b008f21490adffa2be973ee0dac921849d51c7f9c07dc8ef1a198eba"} Apr 28 19:17:27.905354 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:27.905312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: 
\"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:27.905542 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:27.905369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:27.905542 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:27.905517 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:27.905542 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:27.905517 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:27.905709 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:27.905544 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:17:27.905709 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:27.905591 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:43.905571826 +0000 UTC m=+65.256013216 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:27.905709 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:27.905611 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:43.905600457 +0000 UTC m=+65.256041845 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:17:28.006346 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:28.006309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:28.006513 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:28.006414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:28.006577 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:28.006506 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:28.006577 ip-10-0-143-22 
kubenswrapper[2578]: E0428 19:17:28.006536 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:28.006668 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:28.006592 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:44.006570895 +0000 UTC m=+65.357012278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:28.006668 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:28.006614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:44.006604186 +0000 UTC m=+65.357045570 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:32.441734 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:32.441699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7mtgn" event={"ID":"3eaec6bb-3277-478e-9ecc-a557fa5a5b7f","Type":"ContainerStarted","Data":"dfd1dc8d55bd36626f022a5a90887b5a9f1ab126e08cef81dece57e7f0f0dd0d"} Apr 28 19:17:32.458541 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:32.458420 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7mtgn" podStartSLOduration=33.389016644 podStartE2EDuration="38.458401986s" podCreationTimestamp="2026-04-28 19:16:54 +0000 UTC" firstStartedPulling="2026-04-28 19:17:26.424491704 +0000 UTC m=+47.774933089" lastFinishedPulling="2026-04-28 19:17:31.493877044 +0000 UTC m=+52.844318431" observedRunningTime="2026-04-28 19:17:32.458384634 +0000 UTC m=+53.808826049" watchObservedRunningTime="2026-04-28 19:17:32.458401986 +0000 UTC m=+53.808843390" Apr 28 19:17:38.365726 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:38.365698 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ssm92" Apr 28 19:17:43.930115 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:43.930075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:43.930127 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:43.930243 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:43.930247 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:43.930265 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:43.930313 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:18:15.930294083 +0000 UTC m=+97.280735466 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:17:43.930585 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:43.930328 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:15.930321168 +0000 UTC m=+97.280762551 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:17:44.031019 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:44.030980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:17:44.031195 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:44.031070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:17:44.031195 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:44.031134 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:44.031280 ip-10-0-143-22 
kubenswrapper[2578]: E0428 19:17:44.031206 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:16.031187135 +0000 UTC m=+97.381628520 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:17:44.031280 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:44.031212 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:44.031280 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:44.031265 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:18:16.031250849 +0000 UTC m=+97.381692236 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:17:44.939016 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:44.938977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:17:44.939413 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:44.939128 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:17:44.939413 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:17:44.939189 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:48.939174637 +0000 UTC m=+130.289616022 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : secret "metrics-daemon-secret" not found Apr 28 19:17:53.413239 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:17:53.413206 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k9zr5" Apr 28 19:18:15.981353 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:18:15.981319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:18:15.981367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:15.981471 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:15.981483 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:15.981491 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:15.981543 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:19.981527047 +0000 UTC m=+161.331968431 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:18:15.981672 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:15.981559 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:19:19.981547653 +0000 UTC m=+161.331989038 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found Apr 28 19:18:16.082736 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:18:16.082687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:18:16.082911 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:18:16.082789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:18:16.082911 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:16.082828 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:18:16.082911 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:16.082885 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:18:16.082911 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:16.082894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:19:20.082880166 +0000 UTC m=+161.433321550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found Apr 28 19:18:16.083067 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:16.082932 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:20.082917761 +0000 UTC m=+161.433359158 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found Apr 28 19:18:49.029282 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:18:49.029239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:18:49.029807 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:49.029389 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:18:49.029807 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:18:49.029461 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs podName:96593340-195c-4a9b-8d15-babb74ebf1c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:51.029445977 +0000 UTC m=+252.379887366 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs") pod "network-metrics-daemon-2ssxm" (UID: "96593340-195c-4a9b-8d15-babb74ebf1c6") : secret "metrics-daemon-secret" not found Apr 28 19:19:15.035247 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:15.035179 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" podUID="bda9f335-17a3-4fe7-8eb6-ba81a4068222" Apr 28 19:19:15.047362 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:15.047330 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" podUID="92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f" Apr 28 19:19:15.102857 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:15.102820 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wlbdc" podUID="b5ab40ee-0c46-43db-8a80-02e47728a72f" Apr 28 19:19:15.145127 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:15.145096 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-c55mw" podUID="8327b8b7-48d4-4d18-bec4-8cea6c826302" Apr 28 19:19:15.210795 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:15.210750 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-multus/network-metrics-daemon-2ssxm" podUID="96593340-195c-4a9b-8d15-babb74ebf1c6" Apr 28 19:19:15.683357 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:15.683325 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:19:15.683527 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:15.683326 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:19:18.213760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:18.213729 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bgmp8_d5c1a9d5-7a1d-4369-837a-3ed96d5f107f/dns-node-resolver/0.log" Apr 28 19:19:19.217015 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:19.216989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hp74n_692b128d-82a4-4c26-b17d-0b4d804ef295/node-ca/0.log" Apr 28 19:19:20.062252 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:20.062195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") pod \"image-registry-5bb764ccc7-ppc7d\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") " pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" Apr 28 19:19:20.062252 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:20.062258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:19:20.062480 ip-10-0-143-22 kubenswrapper[2578]: E0428 
19:19:20.062344 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:19:20.062480 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.062353 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:19:20.062480 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.062367 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bb764ccc7-ppc7d: secret "image-registry-tls" not found Apr 28 19:19:20.062480 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.062439 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls podName:bda9f335-17a3-4fe7-8eb6-ba81a4068222 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:22.062423551 +0000 UTC m=+283.412864934 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls") pod "image-registry-5bb764ccc7-ppc7d" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222") : secret "image-registry-tls" not found Apr 28 19:19:20.062480 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.062452 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert podName:92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f nodeName:}" failed. No retries permitted until 2026-04-28 19:21:22.062446433 +0000 UTC m=+283.412887817 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j9zgn" (UID: "92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f") : secret "networking-console-plugin-cert" not found
Apr 28 19:19:20.163559 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:20.163521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw"
Apr 28 19:19:20.163814 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:20.163598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:19:20.163814 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.163677 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:19:20.163814 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.163710 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:19:20.163814 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.163735 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls podName:8327b8b7-48d4-4d18-bec4-8cea6c826302 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:22.16372021 +0000 UTC m=+283.514161595 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls") pod "dns-default-c55mw" (UID: "8327b8b7-48d4-4d18-bec4-8cea6c826302") : secret "dns-default-metrics-tls" not found
Apr 28 19:19:20.163814 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:20.163754 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert podName:b5ab40ee-0c46-43db-8a80-02e47728a72f nodeName:}" failed. No retries permitted until 2026-04-28 19:21:22.163742999 +0000 UTC m=+283.514184383 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert") pod "ingress-canary-wlbdc" (UID: "b5ab40ee-0c46-43db-8a80-02e47728a72f") : secret "canary-serving-cert" not found
Apr 28 19:19:22.360758 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.360696 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" podUID="e649bde4-cf50-48a9-ad53-c5bbf78f92c5" containerName="addon-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/healthz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 28 19:19:22.397832 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.397786 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" podUID="a00e9848-ea32-4408-8214-1a5a27b0ffb7" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/healthz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 28 19:19:22.403239 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.403211 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" podUID="a00e9848-ea32-4408-8214-1a5a27b0ffb7" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 28 19:19:22.702354 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.702261 2578 generic.go:358] "Generic (PLEG): container finished" podID="a00e9848-ea32-4408-8214-1a5a27b0ffb7" containerID="e064e72558da68018ea368c3fef48903ae3a7d336ad8fa97610ef815465c3ac4" exitCode=1
Apr 28 19:19:22.702354 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.702337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" event={"ID":"a00e9848-ea32-4408-8214-1a5a27b0ffb7","Type":"ContainerDied","Data":"e064e72558da68018ea368c3fef48903ae3a7d336ad8fa97610ef815465c3ac4"}
Apr 28 19:19:22.702757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.702735 2578 scope.go:117] "RemoveContainer" containerID="e064e72558da68018ea368c3fef48903ae3a7d336ad8fa97610ef815465c3ac4"
Apr 28 19:19:22.703782 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.703760 2578 generic.go:358] "Generic (PLEG): container finished" podID="e649bde4-cf50-48a9-ad53-c5bbf78f92c5" containerID="4c46335da42639d40ba31c70141f5dccacb74347900465f6030e2598a38ce9c2" exitCode=255
Apr 28 19:19:22.703866 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.703807 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" event={"ID":"e649bde4-cf50-48a9-ad53-c5bbf78f92c5","Type":"ContainerDied","Data":"4c46335da42639d40ba31c70141f5dccacb74347900465f6030e2598a38ce9c2"}
Apr 28 19:19:22.704070 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:22.704051 2578 scope.go:117] "RemoveContainer" containerID="4c46335da42639d40ba31c70141f5dccacb74347900465f6030e2598a38ce9c2"
Apr 28 19:19:23.707853 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:23.707815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr" event={"ID":"a00e9848-ea32-4408-8214-1a5a27b0ffb7","Type":"ContainerStarted","Data":"fb4224e0491fa77eb5c1be0ef32e83e5517d49edcdd47d7bda7e0b2ea536d683"}
Apr 28 19:19:23.708294 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:23.708135 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:19:23.709156 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:23.709124 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-857df48f6f-7m5wr"
Apr 28 19:19:23.709464 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:23.709447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77ff884f47-h59gf" event={"ID":"e649bde4-cf50-48a9-ad53-c5bbf78f92c5","Type":"ContainerStarted","Data":"acc39fcdb6416517f1b2c043ea50b621fbc461b564df9bfe4c37a73dc2f54792"}
Apr 28 19:19:26.191811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:26.191715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c55mw"
Apr 28 19:19:26.191811 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:26.191738 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wlbdc"
Apr 28 19:19:27.191303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:27.191264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm"
Apr 28 19:19:36.895883 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.895850 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nghpr"]
Apr 28 19:19:36.898927 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.898910 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:36.908370 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.908348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vs7f\""
Apr 28 19:19:36.908509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.908369 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:19:36.908509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.908398 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:19:36.908509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.908350 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:19:36.908509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.908369 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:19:36.934184 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.934163 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nghpr"]
Apr 28 19:19:36.993165 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.993135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:36.993321 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.993188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stlq\" (UniqueName: \"kubernetes.io/projected/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-api-access-6stlq\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:36.993321 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.993286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9de6d54e-8eab-4890-b6e4-99648fd535fc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:36.993435 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.993331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9de6d54e-8eab-4890-b6e4-99648fd535fc-data-volume\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:36.993435 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:36.993377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9de6d54e-8eab-4890-b6e4-99648fd535fc-crio-socket\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094570 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9de6d54e-8eab-4890-b6e4-99648fd535fc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9de6d54e-8eab-4890-b6e4-99648fd535fc-data-volume\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9de6d54e-8eab-4890-b6e4-99648fd535fc-crio-socket\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6stlq\" (UniqueName: \"kubernetes.io/projected/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-api-access-6stlq\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094982 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9de6d54e-8eab-4890-b6e4-99648fd535fc-crio-socket\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.094982 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.094952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9de6d54e-8eab-4890-b6e4-99648fd535fc-data-volume\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.095221 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.095203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.097150 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.097128 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9de6d54e-8eab-4890-b6e4-99648fd535fc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.118746 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.118722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stlq\" (UniqueName: \"kubernetes.io/projected/9de6d54e-8eab-4890-b6e4-99648fd535fc-kube-api-access-6stlq\") pod \"insights-runtime-extractor-nghpr\" (UID: \"9de6d54e-8eab-4890-b6e4-99648fd535fc\") " pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.207344 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.207272 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nghpr"
Apr 28 19:19:37.383611 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.383573 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nghpr"]
Apr 28 19:19:37.388140 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:19:37.388114 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de6d54e_8eab_4890_b6e4_99648fd535fc.slice/crio-1a162fad61077338b4639e8a6d2a5a2a39d5366094794521a70fe2be8c109569 WatchSource:0}: Error finding container 1a162fad61077338b4639e8a6d2a5a2a39d5366094794521a70fe2be8c109569: Status 404 returned error can't find the container with id 1a162fad61077338b4639e8a6d2a5a2a39d5366094794521a70fe2be8c109569
Apr 28 19:19:37.749278 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.749246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nghpr" event={"ID":"9de6d54e-8eab-4890-b6e4-99648fd535fc","Type":"ContainerStarted","Data":"2d60956f8f4e6e7fc8e8cf837eb451604fbab06866992c8c84df3a14af62bcb5"}
Apr 28 19:19:37.749278 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:37.749281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nghpr" event={"ID":"9de6d54e-8eab-4890-b6e4-99648fd535fc","Type":"ContainerStarted","Data":"1a162fad61077338b4639e8a6d2a5a2a39d5366094794521a70fe2be8c109569"}
Apr 28 19:19:38.753326 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:38.753287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nghpr" event={"ID":"9de6d54e-8eab-4890-b6e4-99648fd535fc","Type":"ContainerStarted","Data":"1e6b5700ba8f2691d4ba56549de8c080568eb6eef0b55815d02b608b91f0c9e6"}
Apr 28 19:19:39.757165 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:39.757122 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nghpr" event={"ID":"9de6d54e-8eab-4890-b6e4-99648fd535fc","Type":"ContainerStarted","Data":"e7a0b0bd3939cd98dcffc95f02f23121566df5c239c51bee2829757b9ed37511"}
Apr 28 19:19:39.781698 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:39.781648 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nghpr" podStartSLOduration=1.886989469 podStartE2EDuration="3.781614119s" podCreationTimestamp="2026-04-28 19:19:36 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.444965292 +0000 UTC m=+178.795406676" lastFinishedPulling="2026-04-28 19:19:39.339589938 +0000 UTC m=+180.690031326" observedRunningTime="2026-04-28 19:19:39.781430646 +0000 UTC m=+181.131872053" watchObservedRunningTime="2026-04-28 19:19:39.781614119 +0000 UTC m=+181.132055531"
Apr 28 19:19:59.566382 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.566346 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bb764ccc7-ppc7d"]
Apr 28 19:19:59.566862 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:19:59.566560 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d" podUID="bda9f335-17a3-4fe7-8eb6-ba81a4068222"
Apr 28 19:19:59.802486 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.802457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:19:59.806509 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.806485 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:19:59.883832 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883806 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883853 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pmwf\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883876 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883900 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883932 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883952 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884012 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.883988 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca\") pod \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\" (UID: \"bda9f335-17a3-4fe7-8eb6-ba81a4068222\") "
Apr 28 19:19:59.884380 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.884350 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:19:59.884448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.884395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:59.884569 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.884546 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:59.886274 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.886248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf" (OuterVolumeSpecName: "kube-api-access-7pmwf") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "kube-api-access-7pmwf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:59.886274 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.886257 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:59.886409 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.886282 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:59.886409 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.886334 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bda9f335-17a3-4fe7-8eb6-ba81a4068222" (UID: "bda9f335-17a3-4fe7-8eb6-ba81a4068222"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:59.985081 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985045 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-certificates\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985081 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985077 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bda9f335-17a3-4fe7-8eb6-ba81a4068222-ca-trust-extracted\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985081 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985088 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-image-registry-private-configuration\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985306 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985099 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda9f335-17a3-4fe7-8eb6-ba81a4068222-trusted-ca\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985306 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985109 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-bound-sa-token\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985306 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985117 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pmwf\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-kube-api-access-7pmwf\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:19:59.985306 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:19:59.985126 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bda9f335-17a3-4fe7-8eb6-ba81a4068222-installation-pull-secrets\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:20:00.804674 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:00.804614 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bb764ccc7-ppc7d"
Apr 28 19:20:00.844873 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:00.844839 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bb764ccc7-ppc7d"]
Apr 28 19:20:00.854200 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:00.854177 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5bb764ccc7-ppc7d"]
Apr 28 19:20:00.992821 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:00.992784 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bda9f335-17a3-4fe7-8eb6-ba81a4068222-registry-tls\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:20:01.194584 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.194546 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda9f335-17a3-4fe7-8eb6-ba81a4068222" path="/var/lib/kubelet/pods/bda9f335-17a3-4fe7-8eb6-ba81a4068222/volumes"
Apr 28 19:20:01.826535 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.826502 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pgxqf"]
Apr 28 19:20:01.830646 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.830617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.835093 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.835069 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-srbh6\""
Apr 28 19:20:01.835230 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.835068 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:20:01.835230 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.835074 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:20:01.836057 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.836035 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:20:01.836236 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.836081 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:20:01.836236 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.836131 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:20:01.836236 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.836155 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:20:01.899195 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-root\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-wtmp\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-metrics-client-ca\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-textfile\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhqw\" (UniqueName: \"kubernetes.io/projected/d8445255-e1df-44af-92d4-781af5f9f6b1-kube-api-access-8nhqw\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-sys\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:01.899538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:01.899447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-accelerators-collector-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000591 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhqw\" (UniqueName: \"kubernetes.io/projected/d8445255-e1df-44af-92d4-781af5f9f6b1-kube-api-access-8nhqw\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-sys\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-accelerators-collector-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-sys\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-root\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-root\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.000788 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-wtmp\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-metrics-client-ca\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-textfile\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.000923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-wtmp\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf"
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:20:02.000951 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 28 19:20:02.001080 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:20:02.001015 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls podName:d8445255-e1df-44af-92d4-781af5f9f6b1 nodeName:}" failed. No retries permitted until 2026-04-28 19:20:02.500995697 +0000 UTC m=+203.851437100 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls") pod "node-exporter-pgxqf" (UID: "d8445255-e1df-44af-92d4-781af5f9f6b1") : secret "node-exporter-tls" not found Apr 28 19:20:02.001454 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.001236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-textfile\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.001454 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.001259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-accelerators-collector-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.001454 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.001343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8445255-e1df-44af-92d4-781af5f9f6b1-metrics-client-ca\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.003406 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.003385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.012057 ip-10-0-143-22 
kubenswrapper[2578]: I0428 19:20:02.012033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhqw\" (UniqueName: \"kubernetes.io/projected/d8445255-e1df-44af-92d4-781af5f9f6b1-kube-api-access-8nhqw\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.505207 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.505173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.507592 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.507566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8445255-e1df-44af-92d4-781af5f9f6b1-node-exporter-tls\") pod \"node-exporter-pgxqf\" (UID: \"d8445255-e1df-44af-92d4-781af5f9f6b1\") " pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.743608 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.743570 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pgxqf" Apr 28 19:20:02.751779 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:20:02.751747 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8445255_e1df_44af_92d4_781af5f9f6b1.slice/crio-1fdb880e3c4085689fd23ff8a3d5de4f85485d288d0e909a94a0a3b12d961178 WatchSource:0}: Error finding container 1fdb880e3c4085689fd23ff8a3d5de4f85485d288d0e909a94a0a3b12d961178: Status 404 returned error can't find the container with id 1fdb880e3c4085689fd23ff8a3d5de4f85485d288d0e909a94a0a3b12d961178 Apr 28 19:20:02.810334 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:02.810300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pgxqf" event={"ID":"d8445255-e1df-44af-92d4-781af5f9f6b1","Type":"ContainerStarted","Data":"1fdb880e3c4085689fd23ff8a3d5de4f85485d288d0e909a94a0a3b12d961178"} Apr 28 19:20:03.814061 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:03.814033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pgxqf" event={"ID":"d8445255-e1df-44af-92d4-781af5f9f6b1","Type":"ContainerStarted","Data":"a8fe8484c792a9d357645396e60750b9b45a02b60ab6187ff562c740aa41cc21"} Apr 28 19:20:04.818112 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:04.818077 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8445255-e1df-44af-92d4-781af5f9f6b1" containerID="a8fe8484c792a9d357645396e60750b9b45a02b60ab6187ff562c740aa41cc21" exitCode=0 Apr 28 19:20:04.818541 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:04.818165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pgxqf" event={"ID":"d8445255-e1df-44af-92d4-781af5f9f6b1","Type":"ContainerDied","Data":"a8fe8484c792a9d357645396e60750b9b45a02b60ab6187ff562c740aa41cc21"} Apr 28 19:20:05.822585 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:05.822553 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pgxqf" event={"ID":"d8445255-e1df-44af-92d4-781af5f9f6b1","Type":"ContainerStarted","Data":"4def1912a3eda866a86452efac2135c60353f75333f1c02bdc071b996948cfd8"} Apr 28 19:20:05.822585 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:05.822588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pgxqf" event={"ID":"d8445255-e1df-44af-92d4-781af5f9f6b1","Type":"ContainerStarted","Data":"0a58410d143b9e8ca25d54f24e538bfaf0b2940df7072ac9e1eb4e69b29981d6"} Apr 28 19:20:05.851551 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:05.851502 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pgxqf" podStartSLOduration=3.853928562 podStartE2EDuration="4.851486776s" podCreationTimestamp="2026-04-28 19:20:01 +0000 UTC" firstStartedPulling="2026-04-28 19:20:02.753775501 +0000 UTC m=+204.104216885" lastFinishedPulling="2026-04-28 19:20:03.7513337 +0000 UTC m=+205.101775099" observedRunningTime="2026-04-28 19:20:05.850324683 +0000 UTC m=+207.200766089" watchObservedRunningTime="2026-04-28 19:20:05.851486776 +0000 UTC m=+207.201928181" Apr 28 19:20:12.368789 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:12.368747 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" podUID="67ae9b4c-6c63-4813-b08e-8ec2f3197cdf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 28 19:20:22.368862 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:22.368818 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" podUID="67ae9b4c-6c63-4813-b08e-8ec2f3197cdf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 28 19:20:32.368333 ip-10-0-143-22 
kubenswrapper[2578]: I0428 19:20:32.368290 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" podUID="67ae9b4c-6c63-4813-b08e-8ec2f3197cdf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 28 19:20:32.368857 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.368376 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" Apr 28 19:20:32.369042 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.369006 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"3629ee5738887da115828118dfe3688f78d98e7332020f936114e1bbd7016bfe"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 28 19:20:32.369115 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.369083 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" podUID="67ae9b4c-6c63-4813-b08e-8ec2f3197cdf" containerName="service-proxy" containerID="cri-o://3629ee5738887da115828118dfe3688f78d98e7332020f936114e1bbd7016bfe" gracePeriod=30 Apr 28 19:20:32.892460 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.892429 2578 generic.go:358] "Generic (PLEG): container finished" podID="67ae9b4c-6c63-4813-b08e-8ec2f3197cdf" containerID="3629ee5738887da115828118dfe3688f78d98e7332020f936114e1bbd7016bfe" exitCode=2 Apr 28 19:20:32.892624 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.892472 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" 
event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerDied","Data":"3629ee5738887da115828118dfe3688f78d98e7332020f936114e1bbd7016bfe"} Apr 28 19:20:32.892624 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:32.892514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d4c6b76c-ncnd5" event={"ID":"67ae9b4c-6c63-4813-b08e-8ec2f3197cdf","Type":"ContainerStarted","Data":"42691aa7eaa0e72576ea90735b5cdfb480694625308aa6e618aa367500375072"} Apr 28 19:20:51.098786 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.098744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:20:51.101158 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.101137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96593340-195c-4a9b-8d15-babb74ebf1c6-metrics-certs\") pod \"network-metrics-daemon-2ssxm\" (UID: \"96593340-195c-4a9b-8d15-babb74ebf1c6\") " pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:20:51.196400 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.196371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cnh2p\"" Apr 28 19:20:51.202655 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.202620 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ssxm" Apr 28 19:20:51.320741 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.320704 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ssxm"] Apr 28 19:20:51.322703 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:20:51.322671 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96593340_195c_4a9b_8d15_babb74ebf1c6.slice/crio-460ec493a6e33e68787664f9cf1a384034241dd23b84eec87e7bb3f889c171e9 WatchSource:0}: Error finding container 460ec493a6e33e68787664f9cf1a384034241dd23b84eec87e7bb3f889c171e9: Status 404 returned error can't find the container with id 460ec493a6e33e68787664f9cf1a384034241dd23b84eec87e7bb3f889c171e9 Apr 28 19:20:51.940286 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:51.940237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ssxm" event={"ID":"96593340-195c-4a9b-8d15-babb74ebf1c6","Type":"ContainerStarted","Data":"460ec493a6e33e68787664f9cf1a384034241dd23b84eec87e7bb3f889c171e9"} Apr 28 19:20:52.945207 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:52.945167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ssxm" event={"ID":"96593340-195c-4a9b-8d15-babb74ebf1c6","Type":"ContainerStarted","Data":"1546debc633cd9cf61efae9c78c4aba0ac0d6da37a741def35cd435e5fb572b0"} Apr 28 19:20:52.945207 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:52.945211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ssxm" event={"ID":"96593340-195c-4a9b-8d15-babb74ebf1c6","Type":"ContainerStarted","Data":"e261f384fe3bf63641fe690642429c41c0bdbc56fbee4100cf9192d0730b4e4a"} Apr 28 19:20:52.963970 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:20:52.963911 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-2ssxm" podStartSLOduration=253.023630398 podStartE2EDuration="4m13.963896804s" podCreationTimestamp="2026-04-28 19:16:39 +0000 UTC" firstStartedPulling="2026-04-28 19:20:51.324492563 +0000 UTC m=+252.674933947" lastFinishedPulling="2026-04-28 19:20:52.264758969 +0000 UTC m=+253.615200353" observedRunningTime="2026-04-28 19:20:52.963799274 +0000 UTC m=+254.314240682" watchObservedRunningTime="2026-04-28 19:20:52.963896804 +0000 UTC m=+254.314338241" Apr 28 19:21:18.683956 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:21:18.683904 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" podUID="92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f" Apr 28 19:21:19.013079 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:19.012998 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:21:22.145400 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.145341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:21:22.147922 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.147893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j9zgn\" (UID: \"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:21:22.246760 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.246709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:21:22.246942 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.246811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:21:22.249224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.249192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8327b8b7-48d4-4d18-bec4-8cea6c826302-metrics-tls\") pod \"dns-default-c55mw\" (UID: \"8327b8b7-48d4-4d18-bec4-8cea6c826302\") " pod="openshift-dns/dns-default-c55mw" Apr 28 19:21:22.249342 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.249233 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5ab40ee-0c46-43db-8a80-02e47728a72f-cert\") pod \"ingress-canary-wlbdc\" (UID: \"b5ab40ee-0c46-43db-8a80-02e47728a72f\") " pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:21:22.295075 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.295046 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-524nx\"" Apr 28 19:21:22.296285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.296268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jtx2j\"" Apr 28 19:21:22.302517 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.302503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c55mw" Apr 28 19:21:22.302600 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.302506 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wlbdc" Apr 28 19:21:22.316290 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.316255 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bssxn\"" Apr 28 19:21:22.324517 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.324486 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" Apr 28 19:21:22.504311 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:21:22.504279 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ab40ee_0c46_43db_8a80_02e47728a72f.slice/crio-6873eefc5bfce7924c4a76f5c61a2784539152a754cb30ac64353461e813594e WatchSource:0}: Error finding container 6873eefc5bfce7924c4a76f5c61a2784539152a754cb30ac64353461e813594e: Status 404 returned error can't find the container with id 6873eefc5bfce7924c4a76f5c61a2784539152a754cb30ac64353461e813594e Apr 28 19:21:22.528712 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.528686 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wlbdc"] Apr 28 19:21:22.541115 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.541064 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn"] Apr 28 19:21:22.542322 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:21:22.542296 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92170fe7_3ead_4d08_90fa_aa8b6a8f3a4f.slice/crio-18c1829661ee0116c2d6f7f0afde7050c7d23fba8ba4d7fdb4a87b12c3114804 WatchSource:0}: Error finding container 18c1829661ee0116c2d6f7f0afde7050c7d23fba8ba4d7fdb4a87b12c3114804: Status 404 returned error can't find the container with id 18c1829661ee0116c2d6f7f0afde7050c7d23fba8ba4d7fdb4a87b12c3114804 Apr 28 19:21:22.611227 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:21:22.611185 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8327b8b7_48d4_4d18_bec4_8cea6c826302.slice/crio-b82cc9a50411a7d3328949b51d17591134d738076d7957a42e9e962b1bc456dc WatchSource:0}: Error finding container 
b82cc9a50411a7d3328949b51d17591134d738076d7957a42e9e962b1bc456dc: Status 404 returned error can't find the container with id b82cc9a50411a7d3328949b51d17591134d738076d7957a42e9e962b1bc456dc Apr 28 19:21:22.617707 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:22.616090 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c55mw"] Apr 28 19:21:23.023644 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:23.023601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" event={"ID":"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f","Type":"ContainerStarted","Data":"18c1829661ee0116c2d6f7f0afde7050c7d23fba8ba4d7fdb4a87b12c3114804"} Apr 28 19:21:23.024414 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:23.024385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c55mw" event={"ID":"8327b8b7-48d4-4d18-bec4-8cea6c826302","Type":"ContainerStarted","Data":"b82cc9a50411a7d3328949b51d17591134d738076d7957a42e9e962b1bc456dc"} Apr 28 19:21:23.025306 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:23.025277 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wlbdc" event={"ID":"b5ab40ee-0c46-43db-8a80-02e47728a72f","Type":"ContainerStarted","Data":"6873eefc5bfce7924c4a76f5c61a2784539152a754cb30ac64353461e813594e"} Apr 28 19:21:25.033171 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.033131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c55mw" event={"ID":"8327b8b7-48d4-4d18-bec4-8cea6c826302","Type":"ContainerStarted","Data":"b34932d2e11c7e9d5afa2b081028a308004f39d1a2c97320504be0bded883310"} Apr 28 19:21:25.033171 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.033174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c55mw" 
event={"ID":"8327b8b7-48d4-4d18-bec4-8cea6c826302","Type":"ContainerStarted","Data":"c77acc58bcd7459a3f0ca93790fb1e1b4d298bd25bad188f1503fd5fcad6576e"} Apr 28 19:21:25.033697 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.033231 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-c55mw" Apr 28 19:21:25.034456 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.034427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wlbdc" event={"ID":"b5ab40ee-0c46-43db-8a80-02e47728a72f","Type":"ContainerStarted","Data":"345b48bb138a65e57351c651cbf6e5d355a3363a132bec4245709066a00c8e65"} Apr 28 19:21:25.035769 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.035729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" event={"ID":"92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f","Type":"ContainerStarted","Data":"9905b7a863c33021843dc959e9cda5634564ef58a3e4c3568cb712c69921d504"} Apr 28 19:21:25.054890 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.054801 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c55mw" podStartSLOduration=251.113981457 podStartE2EDuration="4m13.05478581s" podCreationTimestamp="2026-04-28 19:17:12 +0000 UTC" firstStartedPulling="2026-04-28 19:21:22.613270067 +0000 UTC m=+283.963711452" lastFinishedPulling="2026-04-28 19:21:24.554074415 +0000 UTC m=+285.904515805" observedRunningTime="2026-04-28 19:21:25.053658333 +0000 UTC m=+286.404099740" watchObservedRunningTime="2026-04-28 19:21:25.05478581 +0000 UTC m=+286.405227209" Apr 28 19:21:25.071374 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.071324 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j9zgn" podStartSLOduration=264.066418567 podStartE2EDuration="4m26.071310185s" 
podCreationTimestamp="2026-04-28 19:16:59 +0000 UTC" firstStartedPulling="2026-04-28 19:21:22.544108251 +0000 UTC m=+283.894549635" lastFinishedPulling="2026-04-28 19:21:24.548999854 +0000 UTC m=+285.899441253" observedRunningTime="2026-04-28 19:21:25.070783997 +0000 UTC m=+286.421225402" watchObservedRunningTime="2026-04-28 19:21:25.071310185 +0000 UTC m=+286.421751591" Apr 28 19:21:25.095421 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:25.095376 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wlbdc" podStartSLOduration=252.042730937 podStartE2EDuration="4m14.095360989s" podCreationTimestamp="2026-04-28 19:17:11 +0000 UTC" firstStartedPulling="2026-04-28 19:21:22.505996437 +0000 UTC m=+283.856437825" lastFinishedPulling="2026-04-28 19:21:24.558626491 +0000 UTC m=+285.909067877" observedRunningTime="2026-04-28 19:21:25.094947392 +0000 UTC m=+286.445388801" watchObservedRunningTime="2026-04-28 19:21:25.095360989 +0000 UTC m=+286.445802394" Apr 28 19:21:35.041794 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:35.041757 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-c55mw" Apr 28 19:21:39.099928 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:39.099877 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:21:39.100405 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:39.100352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:21:39.103285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:21:39.103266 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:24:37.226497 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.226458 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6"] Apr 28 19:24:37.229368 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.229344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.233776 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.233755 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 28 19:24:37.233906 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.233753 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:24:37.233985 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.233912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-8bq8l\"" Apr 28 19:24:37.244670 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.244619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6"] Apr 28 19:24:37.279267 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.279235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d722770d-3478-4b8b-9449-9c7f135c0e69-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.279402 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.279272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddpw7\" (UniqueName: 
\"kubernetes.io/projected/d722770d-3478-4b8b-9449-9c7f135c0e69-kube-api-access-ddpw7\") pod \"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.380507 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.380461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d722770d-3478-4b8b-9449-9c7f135c0e69-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.380507 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.380511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddpw7\" (UniqueName: \"kubernetes.io/projected/d722770d-3478-4b8b-9449-9c7f135c0e69-kube-api-access-ddpw7\") pod \"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.380871 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.380849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d722770d-3478-4b8b-9449-9c7f135c0e69-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.392135 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.392098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddpw7\" (UniqueName: \"kubernetes.io/projected/d722770d-3478-4b8b-9449-9c7f135c0e69-kube-api-access-ddpw7\") pod 
\"cert-manager-operator-controller-manager-54b9655956-6thr6\" (UID: \"d722770d-3478-4b8b-9449-9c7f135c0e69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.539986 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.539887 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" Apr 28 19:24:37.671178 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.671145 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6"] Apr 28 19:24:37.674527 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:24:37.674498 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd722770d_3478_4b8b_9449_9c7f135c0e69.slice/crio-1b7c1d1d005cb59a02263f8ce59c64513ae02a9bb22519d56055b8236918221a WatchSource:0}: Error finding container 1b7c1d1d005cb59a02263f8ce59c64513ae02a9bb22519d56055b8236918221a: Status 404 returned error can't find the container with id 1b7c1d1d005cb59a02263f8ce59c64513ae02a9bb22519d56055b8236918221a Apr 28 19:24:37.676850 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:37.676837 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:24:38.530909 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:38.530861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" event={"ID":"d722770d-3478-4b8b-9449-9c7f135c0e69","Type":"ContainerStarted","Data":"1b7c1d1d005cb59a02263f8ce59c64513ae02a9bb22519d56055b8236918221a"} Apr 28 19:24:40.539140 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:40.539054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" 
event={"ID":"d722770d-3478-4b8b-9449-9c7f135c0e69","Type":"ContainerStarted","Data":"0a3861aec87e8497dc440096f976ddd6af2d3db21a428f17905e000a472e9b81"} Apr 28 19:24:40.563542 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:40.563500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-6thr6" podStartSLOduration=1.12221037 podStartE2EDuration="3.5634872s" podCreationTimestamp="2026-04-28 19:24:37 +0000 UTC" firstStartedPulling="2026-04-28 19:24:37.6769656 +0000 UTC m=+479.027406984" lastFinishedPulling="2026-04-28 19:24:40.118242416 +0000 UTC m=+481.468683814" observedRunningTime="2026-04-28 19:24:40.563198053 +0000 UTC m=+481.913639460" watchObservedRunningTime="2026-04-28 19:24:40.5634872 +0000 UTC m=+481.913928606" Apr 28 19:24:48.984245 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:48.984204 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kc75c"] Apr 28 19:24:48.987351 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:48.987334 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:48.990146 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:48.990105 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 28 19:24:48.991297 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:48.991278 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 28 19:24:48.991297 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:48.991288 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-xbbw6\"" Apr 28 19:24:49.048201 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.048173 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kc75c"] Apr 28 19:24:49.065397 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.065364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.065534 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.065438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jb98\" (UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-kube-api-access-5jb98\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.165923 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.165894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jb98\" 
(UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-kube-api-access-5jb98\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.166086 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.165939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.179147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.179116 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.179275 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.179221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jb98\" (UniqueName: \"kubernetes.io/projected/feb7fd40-fbb0-46a1-bbb7-9d20d9c30116-kube-api-access-5jb98\") pod \"cert-manager-webhook-587ccfb98-kc75c\" (UID: \"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.295903 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.295820 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:49.417025 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.416991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kc75c"] Apr 28 19:24:49.420273 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:24:49.420248 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb7fd40_fbb0_46a1_bbb7_9d20d9c30116.slice/crio-d625a2e834103e385ae655c46857438785653bb7ce8bee6c7d76f0553c4d4776 WatchSource:0}: Error finding container d625a2e834103e385ae655c46857438785653bb7ce8bee6c7d76f0553c4d4776: Status 404 returned error can't find the container with id d625a2e834103e385ae655c46857438785653bb7ce8bee6c7d76f0553c4d4776 Apr 28 19:24:49.564933 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:49.564843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" event={"ID":"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116","Type":"ContainerStarted","Data":"d625a2e834103e385ae655c46857438785653bb7ce8bee6c7d76f0553c4d4776"} Apr 28 19:24:52.287851 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.287809 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-bclht"] Apr 28 19:24:52.291163 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.291140 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.293821 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.293795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-w59nf\"" Apr 28 19:24:52.300800 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.300778 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-bclht"] Apr 28 19:24:52.393622 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.393585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b5fs\" (UniqueName: \"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-kube-api-access-5b5fs\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.393820 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.393705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-bound-sa-token\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.494078 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.494038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-bound-sa-token\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.494291 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.494105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b5fs\" (UniqueName: 
\"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-kube-api-access-5b5fs\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.502904 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.502864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-bound-sa-token\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.503005 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.502875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b5fs\" (UniqueName: \"kubernetes.io/projected/0a62b493-98cc-4213-9153-19a1a5a3a0e7-kube-api-access-5b5fs\") pod \"cert-manager-79c8d999ff-bclht\" (UID: \"0a62b493-98cc-4213-9153-19a1a5a3a0e7\") " pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.574770 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.574685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" event={"ID":"feb7fd40-fbb0-46a1-bbb7-9d20d9c30116","Type":"ContainerStarted","Data":"4a9a484c8f344fdb1859db3128a043b037c9bf6229abc2b45a911c6e93d4a806"} Apr 28 19:24:52.574928 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.574822 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:24:52.591802 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.591753 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" podStartSLOduration=2.257661534 podStartE2EDuration="4.59173895s" podCreationTimestamp="2026-04-28 19:24:48 +0000 UTC" firstStartedPulling="2026-04-28 19:24:49.422013555 +0000 UTC 
m=+490.772454940" lastFinishedPulling="2026-04-28 19:24:51.756090969 +0000 UTC m=+493.106532356" observedRunningTime="2026-04-28 19:24:52.591032548 +0000 UTC m=+493.941473948" watchObservedRunningTime="2026-04-28 19:24:52.59173895 +0000 UTC m=+493.942180356" Apr 28 19:24:52.599336 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.599312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-bclht" Apr 28 19:24:52.736077 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:52.736045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-bclht"] Apr 28 19:24:52.741247 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:24:52.741219 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a62b493_98cc_4213_9153_19a1a5a3a0e7.slice/crio-3ac46a6303e5573fac5218670d6bd4c0d4ff4edb71618953d3a29e0d5e6227e5 WatchSource:0}: Error finding container 3ac46a6303e5573fac5218670d6bd4c0d4ff4edb71618953d3a29e0d5e6227e5: Status 404 returned error can't find the container with id 3ac46a6303e5573fac5218670d6bd4c0d4ff4edb71618953d3a29e0d5e6227e5 Apr 28 19:24:53.579137 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:53.579099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-bclht" event={"ID":"0a62b493-98cc-4213-9153-19a1a5a3a0e7","Type":"ContainerStarted","Data":"932db40555e29fdb9814ee8ca566cb76d9de32aba12c3840574bcf7d65ffc63c"} Apr 28 19:24:53.579137 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:53.579138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-bclht" event={"ID":"0a62b493-98cc-4213-9153-19a1a5a3a0e7","Type":"ContainerStarted","Data":"3ac46a6303e5573fac5218670d6bd4c0d4ff4edb71618953d3a29e0d5e6227e5"} Apr 28 19:24:53.606522 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:53.606477 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-bclht" podStartSLOduration=1.6064613620000001 podStartE2EDuration="1.606461362s" podCreationTimestamp="2026-04-28 19:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:24:53.604918627 +0000 UTC m=+494.955360044" watchObservedRunningTime="2026-04-28 19:24:53.606461362 +0000 UTC m=+494.956902767" Apr 28 19:24:58.581869 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:24:58.581834 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-kc75c" Apr 28 19:25:17.468995 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.468950 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc"] Apr 28 19:25:17.478208 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.478174 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.484138 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.484109 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 28 19:25:17.485102 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.485081 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 28 19:25:17.485102 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.485089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 28 19:25:17.485258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.485117 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:25:17.485258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.485089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z4smj\"" Apr 28 19:25:17.485258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.485092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 28 19:25:17.493663 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.493645 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc"] Apr 28 19:25:17.673674 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.673612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " 
pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.673674 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.673675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa308b9b-52b6-41bb-8041-bfb109b7754e-manager-config\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.673922 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.673715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.673922 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.673792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdc9\" (UniqueName: \"kubernetes.io/projected/fa308b9b-52b6-41bb-8041-bfb109b7754e-kube-api-access-5jdc9\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.775042 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.774957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.775042 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.775006 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa308b9b-52b6-41bb-8041-bfb109b7754e-manager-config\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.775042 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.775032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.775253 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.775069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdc9\" (UniqueName: \"kubernetes.io/projected/fa308b9b-52b6-41bb-8041-bfb109b7754e-kube-api-access-5jdc9\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.775657 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.775604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fa308b9b-52b6-41bb-8041-bfb109b7754e-manager-config\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.777601 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.777579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " 
pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.777729 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.777709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa308b9b-52b6-41bb-8041-bfb109b7754e-cert\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.785063 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.785043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdc9\" (UniqueName: \"kubernetes.io/projected/fa308b9b-52b6-41bb-8041-bfb109b7754e-kube-api-access-5jdc9\") pod \"lws-controller-manager-79cf4cb497-p49cc\" (UID: \"fa308b9b-52b6-41bb-8041-bfb109b7754e\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.786937 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.786911 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:17.934278 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:17.934254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc"] Apr 28 19:25:17.936951 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:25:17.936926 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa308b9b_52b6_41bb_8041_bfb109b7754e.slice/crio-b002ad6fb0961f7876c3bd7aa89d9fa936e8fd96b29e91951cb27be5ef246248 WatchSource:0}: Error finding container b002ad6fb0961f7876c3bd7aa89d9fa936e8fd96b29e91951cb27be5ef246248: Status 404 returned error can't find the container with id b002ad6fb0961f7876c3bd7aa89d9fa936e8fd96b29e91951cb27be5ef246248 Apr 28 19:25:18.646315 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:18.646271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" event={"ID":"fa308b9b-52b6-41bb-8041-bfb109b7754e","Type":"ContainerStarted","Data":"b002ad6fb0961f7876c3bd7aa89d9fa936e8fd96b29e91951cb27be5ef246248"} Apr 28 19:25:20.654869 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:20.654832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" event={"ID":"fa308b9b-52b6-41bb-8041-bfb109b7754e","Type":"ContainerStarted","Data":"e8b1633ccad5af7a74ed3eea9f397b28e9c6ab710db32b6682a592872618457b"} Apr 28 19:25:20.655285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:20.654949 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:31.660774 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:31.660737 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" Apr 28 19:25:31.681102 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:25:31.681054 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-p49cc" podStartSLOduration=12.143510974 podStartE2EDuration="14.681038823s" podCreationTimestamp="2026-04-28 19:25:17 +0000 UTC" firstStartedPulling="2026-04-28 19:25:17.938688517 +0000 UTC m=+519.289129901" lastFinishedPulling="2026-04-28 19:25:20.476216366 +0000 UTC m=+521.826657750" observedRunningTime="2026-04-28 19:25:20.688565556 +0000 UTC m=+522.039006962" watchObservedRunningTime="2026-04-28 19:25:31.681038823 +0000 UTC m=+533.031480229" Apr 28 19:26:23.174439 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.174404 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"] Apr 28 19:26:23.177521 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.177504 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.180181 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.180159 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 28 19:26:23.180181 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.180174 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 28 19:26:23.180181 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.180164 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 28 19:26:23.180436 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.180194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 28 19:26:23.180436 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.180266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xgz8s\"" Apr 28 19:26:23.185089 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.185067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"] Apr 28 19:26:23.274409 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.274370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c8b757a-a931-4552-98ef-8d189e70abc4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.274409 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.274405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8kmpn\" (UniqueName: \"kubernetes.io/projected/8c8b757a-a931-4552-98ef-8d189e70abc4-kube-api-access-8kmpn\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.274602 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.274427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8c8b757a-a931-4552-98ef-8d189e70abc4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.375437 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.375390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c8b757a-a931-4552-98ef-8d189e70abc4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.375437 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.375437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmpn\" (UniqueName: \"kubernetes.io/projected/8c8b757a-a931-4552-98ef-8d189e70abc4-kube-api-access-8kmpn\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" Apr 28 19:26:23.375747 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.375461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8c8b757a-a931-4552-98ef-8d189e70abc4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"
Apr 28 19:26:23.376258 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.376236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8c8b757a-a931-4552-98ef-8d189e70abc4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"
Apr 28 19:26:23.378077 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.378058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c8b757a-a931-4552-98ef-8d189e70abc4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"
Apr 28 19:26:23.384546 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.384524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmpn\" (UniqueName: \"kubernetes.io/projected/8c8b757a-a931-4552-98ef-8d189e70abc4-kube-api-access-8kmpn\") pod \"kuadrant-console-plugin-6c886788f8-jttlm\" (UID: \"8c8b757a-a931-4552-98ef-8d189e70abc4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"
Apr 28 19:26:23.487619 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.487528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"
Apr 28 19:26:23.609893 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.609862 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm"]
Apr 28 19:26:23.613066 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:26:23.613027 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c8b757a_a931_4552_98ef_8d189e70abc4.slice/crio-465db15171205dc600b4213a292698d1a9ff280950c529e5a918083f299dc177 WatchSource:0}: Error finding container 465db15171205dc600b4213a292698d1a9ff280950c529e5a918083f299dc177: Status 404 returned error can't find the container with id 465db15171205dc600b4213a292698d1a9ff280950c529e5a918083f299dc177
Apr 28 19:26:23.838884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:23.838803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" event={"ID":"8c8b757a-a931-4552-98ef-8d189e70abc4","Type":"ContainerStarted","Data":"465db15171205dc600b4213a292698d1a9ff280950c529e5a918083f299dc177"}
Apr 28 19:26:28.857547 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:28.857512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" event={"ID":"8c8b757a-a931-4552-98ef-8d189e70abc4","Type":"ContainerStarted","Data":"db58c790e3fd827e277c58e3d759697cfa6f3b65d664794781c0e5c98682c9a4"}
Apr 28 19:26:39.117784 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:39.117755 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:26:39.119438 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:26:39.119409 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:27:04.480128 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.480025 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-jttlm" podStartSLOduration=36.962748566 podStartE2EDuration="41.48001072s" podCreationTimestamp="2026-04-28 19:26:23 +0000 UTC" firstStartedPulling="2026-04-28 19:26:23.614269846 +0000 UTC m=+584.964711229" lastFinishedPulling="2026-04-28 19:26:28.131532 +0000 UTC m=+589.481973383" observedRunningTime="2026-04-28 19:26:28.886334138 +0000 UTC m=+590.236775545" watchObservedRunningTime="2026-04-28 19:27:04.48001072 +0000 UTC m=+625.830452125"
Apr 28 19:27:04.480781 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.480719 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kjjnk"]
Apr 28 19:27:04.483874 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.483856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.487029 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.487012 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 28 19:27:04.500763 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.500742 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kjjnk"]
Apr 28 19:27:04.562345 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.562314 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kjjnk"]
Apr 28 19:27:04.581329 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.581303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfwbg\" (UniqueName: \"kubernetes.io/projected/61657b7e-2014-41bd-9f17-c3184bd201f7-kube-api-access-xfwbg\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.581422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.581353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/61657b7e-2014-41bd-9f17-c3184bd201f7-config-file\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.681955 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.681917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfwbg\" (UniqueName: \"kubernetes.io/projected/61657b7e-2014-41bd-9f17-c3184bd201f7-kube-api-access-xfwbg\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.682134 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.681987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/61657b7e-2014-41bd-9f17-c3184bd201f7-config-file\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.682583 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.682561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/61657b7e-2014-41bd-9f17-c3184bd201f7-config-file\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.691530 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.691503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfwbg\" (UniqueName: \"kubernetes.io/projected/61657b7e-2014-41bd-9f17-c3184bd201f7-kube-api-access-xfwbg\") pod \"limitador-limitador-67566c68b4-kjjnk\" (UID: \"61657b7e-2014-41bd-9f17-c3184bd201f7\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.793743 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.793651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:04.920757 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.920733 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kjjnk"]
Apr 28 19:27:04.923158 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:27:04.923131 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61657b7e_2014_41bd_9f17_c3184bd201f7.slice/crio-330cc1b35662915a6ca82225dde111e1c0753b2c2c4b53223cfbf30bc06cd57f WatchSource:0}: Error finding container 330cc1b35662915a6ca82225dde111e1c0753b2c2c4b53223cfbf30bc06cd57f: Status 404 returned error can't find the container with id 330cc1b35662915a6ca82225dde111e1c0753b2c2c4b53223cfbf30bc06cd57f
Apr 28 19:27:04.977544 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:04.977508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk" event={"ID":"61657b7e-2014-41bd-9f17-c3184bd201f7","Type":"ContainerStarted","Data":"330cc1b35662915a6ca82225dde111e1c0753b2c2c4b53223cfbf30bc06cd57f"}
Apr 28 19:27:06.985118 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:06.985079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk" event={"ID":"61657b7e-2014-41bd-9f17-c3184bd201f7","Type":"ContainerStarted","Data":"f37f4254a225990128504bb68049a7657db6c089cebbcf0f359ebe0e9bb9eebf"}
Apr 28 19:27:06.985589 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:06.985196 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:27:07.018169 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:07.014522 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk" podStartSLOduration=1.671190942 podStartE2EDuration="3.014492934s" podCreationTimestamp="2026-04-28 19:27:04 +0000 UTC" firstStartedPulling="2026-04-28 19:27:04.925066385 +0000 UTC m=+626.275507773" lastFinishedPulling="2026-04-28 19:27:06.268368375 +0000 UTC m=+627.618809765" observedRunningTime="2026-04-28 19:27:07.011962259 +0000 UTC m=+628.362403664" watchObservedRunningTime="2026-04-28 19:27:07.014492934 +0000 UTC m=+628.364934343"
Apr 28 19:27:17.990307 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:27:17.990276 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-kjjnk"
Apr 28 19:28:59.160022 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.159988 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tptx9"]
Apr 28 19:28:59.163147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.163125 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.166000 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.165965 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-jxtbn\""
Apr 28 19:28:59.166296 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.166280 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 28 19:28:59.166940 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.166925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 28 19:28:59.168694 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.168675 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 28 19:28:59.180606 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.180583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tptx9"]
Apr 28 19:28:59.236774 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.236751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/397c0b79-9faa-41e6-ba3e-eec0cc748074-tls-certs\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.236920 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.236820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n22z\" (UniqueName: \"kubernetes.io/projected/397c0b79-9faa-41e6-ba3e-eec0cc748074-kube-api-access-4n22z\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.337353 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.337326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/397c0b79-9faa-41e6-ba3e-eec0cc748074-tls-certs\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.337508 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.337380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n22z\" (UniqueName: \"kubernetes.io/projected/397c0b79-9faa-41e6-ba3e-eec0cc748074-kube-api-access-4n22z\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.339877 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.339858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/397c0b79-9faa-41e6-ba3e-eec0cc748074-tls-certs\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.346094 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.346070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n22z\" (UniqueName: \"kubernetes.io/projected/397c0b79-9faa-41e6-ba3e-eec0cc748074-kube-api-access-4n22z\") pod \"model-serving-api-86f7b4b499-tptx9\" (UID: \"397c0b79-9faa-41e6-ba3e-eec0cc748074\") " pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.472989 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.472908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:28:59.594500 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:28:59.594424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tptx9"]
Apr 28 19:28:59.597157 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:28:59.597130 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397c0b79_9faa_41e6_ba3e_eec0cc748074.slice/crio-5bebb370ea112be1ef3640ac3af5b3a074e7af0ffbcd2b5fdeb1e16598fca3f9 WatchSource:0}: Error finding container 5bebb370ea112be1ef3640ac3af5b3a074e7af0ffbcd2b5fdeb1e16598fca3f9: Status 404 returned error can't find the container with id 5bebb370ea112be1ef3640ac3af5b3a074e7af0ffbcd2b5fdeb1e16598fca3f9
Apr 28 19:29:00.331836 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:00.331795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tptx9" event={"ID":"397c0b79-9faa-41e6-ba3e-eec0cc748074","Type":"ContainerStarted","Data":"5bebb370ea112be1ef3640ac3af5b3a074e7af0ffbcd2b5fdeb1e16598fca3f9"}
Apr 28 19:29:02.339205 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:02.339167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tptx9" event={"ID":"397c0b79-9faa-41e6-ba3e-eec0cc748074","Type":"ContainerStarted","Data":"9f336f08a30ee7a4420b3e93b8edee845abce68c820b1b921e0180589518dd3f"}
Apr 28 19:29:02.339557 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:02.339303 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:29:02.358648 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:02.358584 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tptx9" podStartSLOduration=1.123087286 podStartE2EDuration="3.358571174s" podCreationTimestamp="2026-04-28 19:28:59 +0000 UTC" firstStartedPulling="2026-04-28 19:28:59.598999641 +0000 UTC m=+740.949441025" lastFinishedPulling="2026-04-28 19:29:01.834483529 +0000 UTC m=+743.184924913" observedRunningTime="2026-04-28 19:29:02.356933467 +0000 UTC m=+743.707374909" watchObservedRunningTime="2026-04-28 19:29:02.358571174 +0000 UTC m=+743.709012580"
Apr 28 19:29:13.346317 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:13.346288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tptx9"
Apr 28 19:29:15.404039 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.404009 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-qj2sh"]
Apr 28 19:29:15.407076 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.407059 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qj2sh"
Apr 28 19:29:15.409621 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.409604 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 28 19:29:15.409827 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.409815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vmkw2\""
Apr 28 19:29:15.414174 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.414150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qj2sh"]
Apr 28 19:29:15.557670 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.557606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fhvh\" (UniqueName: \"kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh\") pod \"s3-init-qj2sh\" (UID: \"a72d274a-7708-4123-b7e0-0de0d2ce2e1a\") " pod="kserve/s3-init-qj2sh"
Apr 28 19:29:15.658355 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.658273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fhvh\" (UniqueName: \"kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh\") pod \"s3-init-qj2sh\" (UID: \"a72d274a-7708-4123-b7e0-0de0d2ce2e1a\") " pod="kserve/s3-init-qj2sh"
Apr 28 19:29:15.666891 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.666863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fhvh\" (UniqueName: \"kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh\") pod \"s3-init-qj2sh\" (UID: \"a72d274a-7708-4123-b7e0-0de0d2ce2e1a\") " pod="kserve/s3-init-qj2sh"
Apr 28 19:29:15.716713 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.716684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qj2sh"
Apr 28 19:29:15.833783 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:15.833750 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qj2sh"]
Apr 28 19:29:15.836826 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:29:15.836796 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72d274a_7708_4123_b7e0_0de0d2ce2e1a.slice/crio-b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee WatchSource:0}: Error finding container b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee: Status 404 returned error can't find the container with id b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee
Apr 28 19:29:16.382428 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:16.382386 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qj2sh" event={"ID":"a72d274a-7708-4123-b7e0-0de0d2ce2e1a","Type":"ContainerStarted","Data":"b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee"}
Apr 28 19:29:20.399336 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:20.399294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qj2sh" event={"ID":"a72d274a-7708-4123-b7e0-0de0d2ce2e1a","Type":"ContainerStarted","Data":"62e99161b63a39d470c9652329c785de8379a1cd48fc725540ccaed21636c464"}
Apr 28 19:29:20.416689 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:20.416617 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-qj2sh" podStartSLOduration=0.997064222 podStartE2EDuration="5.416602573s" podCreationTimestamp="2026-04-28 19:29:15 +0000 UTC" firstStartedPulling="2026-04-28 19:29:15.838375612 +0000 UTC m=+757.188816995" lastFinishedPulling="2026-04-28 19:29:20.257913959 +0000 UTC m=+761.608355346" observedRunningTime="2026-04-28 19:29:20.414777764 +0000 UTC m=+761.765219190" watchObservedRunningTime="2026-04-28 19:29:20.416602573 +0000 UTC m=+761.767043980"
Apr 28 19:29:23.410364 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:23.410331 2578 generic.go:358] "Generic (PLEG): container finished" podID="a72d274a-7708-4123-b7e0-0de0d2ce2e1a" containerID="62e99161b63a39d470c9652329c785de8379a1cd48fc725540ccaed21636c464" exitCode=0
Apr 28 19:29:23.410729 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:23.410385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qj2sh" event={"ID":"a72d274a-7708-4123-b7e0-0de0d2ce2e1a","Type":"ContainerDied","Data":"62e99161b63a39d470c9652329c785de8379a1cd48fc725540ccaed21636c464"}
Apr 28 19:29:24.547070 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:24.547044 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qj2sh"
Apr 28 19:29:24.628083 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:24.628054 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fhvh\" (UniqueName: \"kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh\") pod \"a72d274a-7708-4123-b7e0-0de0d2ce2e1a\" (UID: \"a72d274a-7708-4123-b7e0-0de0d2ce2e1a\") "
Apr 28 19:29:24.630286 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:24.630266 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh" (OuterVolumeSpecName: "kube-api-access-2fhvh") pod "a72d274a-7708-4123-b7e0-0de0d2ce2e1a" (UID: "a72d274a-7708-4123-b7e0-0de0d2ce2e1a"). InnerVolumeSpecName "kube-api-access-2fhvh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:29:24.728928 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:24.728861 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2fhvh\" (UniqueName: \"kubernetes.io/projected/a72d274a-7708-4123-b7e0-0de0d2ce2e1a-kube-api-access-2fhvh\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:29:25.417729 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:25.417703 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qj2sh"
Apr 28 19:29:25.417897 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:25.417702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qj2sh" event={"ID":"a72d274a-7708-4123-b7e0-0de0d2ce2e1a","Type":"ContainerDied","Data":"b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee"}
Apr 28 19:29:25.417897 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:29:25.417813 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3c16061f9ac9bdbcb0ce5969686a3b24b74a5dc3272aa864034a62f0fd70eee"
Apr 28 19:30:02.546884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.546802 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:30:02.547310 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.547110 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a72d274a-7708-4123-b7e0-0de0d2ce2e1a" containerName="s3-init"
Apr 28 19:30:02.547310 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.547124 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d274a-7708-4123-b7e0-0de0d2ce2e1a" containerName="s3-init"
Apr 28 19:30:02.547310 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.547183 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a72d274a-7708-4123-b7e0-0de0d2ce2e1a" containerName="s3-init"
Apr 28 19:30:02.550033 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.550017 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.552551 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.552529 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 19:30:02.553660 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.553609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 19:30:02.553772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.553683 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\""
Apr 28 19:30:02.553772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.553738 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 28 19:30:02.561451 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.561427 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:30:02.599247 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599380 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599380 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599380 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599380 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.599538 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.599427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9c5m\" (UniqueName: \"kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.700730 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.700873 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9c5m\" (UniqueName: \"kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.700873 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.700958 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.700958 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701055 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.700969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701055 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.701008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701157 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.701058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.701165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701323 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.701303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.701370 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.701340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.703237 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.703218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.703480 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.703463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.709073 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.709050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9c5m\" (UniqueName: \"kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m\") pod \"scheduler-inline-config-test-kserve-5b88fd65b4-trp8d\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.861413 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.861375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:30:02.993517 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.993493 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:30:02.996359 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:30:02.996313 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35878cf7_e030_4e54_95e7_2ef2ff42df25.slice/crio-98587dff95bfc2f69c395059be8246339d695ba648d93901b70d90acbe3fd449 WatchSource:0}: Error finding container 98587dff95bfc2f69c395059be8246339d695ba648d93901b70d90acbe3fd449: Status 404 returned error can't find the container with id 98587dff95bfc2f69c395059be8246339d695ba648d93901b70d90acbe3fd449
Apr 28 19:30:02.998273 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:02.998259 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:30:03.532661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:03.532611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d" event={"ID":"35878cf7-e030-4e54-95e7-2ef2ff42df25","Type":"ContainerStarted","Data":"98587dff95bfc2f69c395059be8246339d695ba648d93901b70d90acbe3fd449"}
Apr 28 19:30:06.546401 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:06.546305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d" event={"ID":"35878cf7-e030-4e54-95e7-2ef2ff42df25","Type":"ContainerStarted","Data":"271cda1093770c6ebab22b8416a06c35037280bb7ee9eb498d55515a6703d5be"}
Apr 28 19:30:57.965874 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:57.965841 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"]
Apr 28 19:30:57.969596 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:57.969577 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"
Apr 28 19:30:57.972270 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:57.972247 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 28 19:30:57.978336 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:57.978310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"]
Apr 28 19:30:58.017834 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"
Apr 28 19:30:58.017952 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"
Apr 28 19:30:58.017952 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\"
(UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.017952 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.018076 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.018076 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.017996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.018076 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.018026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4kn4h\" (UniqueName: \"kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119279 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119716 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kn4h\" (UniqueName: \"kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119800 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119848 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119895 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.119975 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.119924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.121809 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.121786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.121994 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.121973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.127291 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.127267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kn4h\" (UniqueName: \"kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.280208 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.280130 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:30:58.399958 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.399924 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"] Apr 28 19:30:58.403600 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:30:58.403567 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda257c2b9_c93f_4984_ab84_bfb0cc4ff5cb.slice/crio-42e32a49245e86c8ca21607dc289f4b7d7167d74f7414cb5bdc6c5b3c30e1274 WatchSource:0}: Error finding container 42e32a49245e86c8ca21607dc289f4b7d7167d74f7414cb5bdc6c5b3c30e1274: Status 404 returned error can't find the container with id 42e32a49245e86c8ca21607dc289f4b7d7167d74f7414cb5bdc6c5b3c30e1274 Apr 28 19:30:58.710582 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.710546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerStarted","Data":"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"} Apr 28 19:30:58.710582 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:30:58.710586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerStarted","Data":"42e32a49245e86c8ca21607dc289f4b7d7167d74f7414cb5bdc6c5b3c30e1274"} Apr 28 19:31:39.136932 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:31:39.136908 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:31:39.139412 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:31:39.139389 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:36:06.680478 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:06.680442 2578 generic.go:358] "Generic (PLEG): container finished" podID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerID="cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe" exitCode=0 Apr 28 19:36:06.680998 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:06.680522 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerDied","Data":"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"} Apr 28 19:36:06.681711 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:06.681686 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:36:08.689739 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:08.689701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerStarted","Data":"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"} Apr 28 19:36:08.709250 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:08.709200 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" podStartSLOduration=310.598939442 podStartE2EDuration="5m11.709187486s" podCreationTimestamp="2026-04-28 19:30:57 +0000 UTC" firstStartedPulling="2026-04-28 19:36:06.681804058 +0000 UTC m=+1168.032245442" lastFinishedPulling="2026-04-28 19:36:07.792052102 +0000 UTC m=+1169.142493486" observedRunningTime="2026-04-28 19:36:08.707705582 +0000 UTC m=+1170.058146989" 
watchObservedRunningTime="2026-04-28 19:36:08.709187486 +0000 UTC m=+1170.059628893" Apr 28 19:36:18.280891 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:18.280854 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:36:18.281255 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:18.280944 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:36:18.293015 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:18.292993 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:36:18.731199 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:18.731174 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:36:19.751514 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:19.751483 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"] Apr 28 19:36:19.775361 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:19.775333 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:19.775494 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:19.775406 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs podName:a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb nodeName:}" failed. No retries permitted until 2026-04-28 19:36:20.275391581 +0000 UTC m=+1181.625832965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:20.279947 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:20.279902 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:20.280122 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:20.279981 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs podName:a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb nodeName:}" failed. No retries permitted until 2026-04-28 19:36:21.279967792 +0000 UTC m=+1182.630409176 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:21.288800 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:21.288768 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:21.289184 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:21.288841 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs podName:a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:36:23.288826325 +0000 UTC m=+1184.639267709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 28 19:36:21.728899 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:21.728856 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="main" containerID="cri-o://ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2" gracePeriod=30 Apr 28 19:36:21.973780 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:21.973749 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" Apr 28 19:36:22.094506 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094434 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094506 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094472 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094506 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094500 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094805 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094534 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094805 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094568 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kn4h\" (UniqueName: \"kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094805 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:36:22.094602 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094805 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094648 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location\") pod \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\" (UID: \"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb\") " Apr 28 19:36:22.094805 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094755 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home" (OuterVolumeSpecName: "home") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:22.095009 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094864 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-home\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:36:22.095009 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094879 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache" (OuterVolumeSpecName: "model-cache") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:22.095009 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.094900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:22.096825 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.096793 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm" (OuterVolumeSpecName: "dshm") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:22.096934 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.096844 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:36:22.097061 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.097040 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h" (OuterVolumeSpecName: "kube-api-access-4kn4h") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "kube-api-access-4kn4h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:36:22.149107 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.149076 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" (UID: "a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:22.195693 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195672 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-dshm\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:36:22.195693 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195691 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:36:22.195830 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195700 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-model-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:36:22.195830 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195709 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kn4h\" (UniqueName: \"kubernetes.io/projected/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kube-api-access-4kn4h\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:36:22.195830 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195719 2578 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-tmp-dir\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:36:22.195830 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.195727 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:36:22.733213 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.733181 2578 generic.go:358] "Generic (PLEG): container finished" podID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerID="ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2" exitCode=0
Apr 28 19:36:22.733661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.733252 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"
Apr 28 19:36:22.733661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.733268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerDied","Data":"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"}
Apr 28 19:36:22.733661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.733307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt" event={"ID":"a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb","Type":"ContainerDied","Data":"42e32a49245e86c8ca21607dc289f4b7d7167d74f7414cb5bdc6c5b3c30e1274"}
Apr 28 19:36:22.733661 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.733325 2578 scope.go:117] "RemoveContainer" containerID="ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"
Apr 28 19:36:22.742504 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.742485 2578 scope.go:117] "RemoveContainer" containerID="cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"
Apr 28 19:36:22.751972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.751953 2578 scope.go:117] "RemoveContainer" containerID="ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"
Apr 28 19:36:22.752234 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:22.752215 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2\": container with ID starting with ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2 not found: ID does not exist" containerID="ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"
Apr 28 19:36:22.752285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.752245 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2"} err="failed to get container status \"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2\": rpc error: code = NotFound desc = could not find container \"ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2\": container with ID starting with ca4e5e1d24433cd19ca39ad242af904edf72638826926597cd82d88379ffe7c2 not found: ID does not exist"
Apr 28 19:36:22.752331 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.752289 2578 scope.go:117] "RemoveContainer" containerID="cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"
Apr 28 19:36:22.752576 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:36:22.752491 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe\": container with ID starting with cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe not found: ID does not exist" containerID="cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"
Apr 28 19:36:22.752576 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.752527 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe"} err="failed to get container status \"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe\": rpc error: code = NotFound desc = could not find container \"cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe\": container with ID starting with cdf51cf30ce080994ab8e9efb876e0fb904dc2eb74a02ef40bbc1ab7c546aafe not found: ID does not exist"
Apr 28 19:36:22.755067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.755044 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"]
Apr 28 19:36:22.758203 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:22.758178 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-fc4c49cf6-kfsvt"]
Apr 28 19:36:23.195972 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:23.195942 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" path="/var/lib/kubelet/pods/a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb/volumes"
Apr 28 19:36:39.160455 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:39.160423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:36:39.163235 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:36:39.163214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:41:39.179794 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:41:39.179767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:41:39.183349 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:41:39.183332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:44:38.270607 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:38.270569 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:44:38.271169 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:38.270874 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d" podUID="35878cf7-e030-4e54-95e7-2ef2ff42df25" containerName="storage-initializer" containerID="cri-o://271cda1093770c6ebab22b8416a06c35037280bb7ee9eb498d55515a6703d5be" gracePeriod=30
Apr 28 19:44:55.367839 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.367802 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"]
Apr 28 19:44:55.368268 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.368192 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="main"
Apr 28 19:44:55.368268 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.368206 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="main"
Apr 28 19:44:55.368268 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.368232 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="storage-initializer"
Apr 28 19:44:55.368268 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.368242 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="storage-initializer"
Apr 28 19:44:55.368451 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.368304 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a257c2b9-c93f-4984-ab84-bfb0cc4ff5cb" containerName="main"
Apr 28 19:44:55.371539 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.371521 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.373919 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.373897 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 28 19:44:55.381424 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.381402 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"]
Apr 28 19:44:55.458430 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458430 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458626 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458626 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458626 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458626 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.458769 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.458659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhzz\" (UniqueName: \"kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559271 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559451 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559451 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhzz\" (UniqueName: \"kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559451 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559607 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559607 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559607 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559767 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559823 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.559988 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.559960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.560162 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.560141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.562133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.562104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.562416 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.562397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.569052 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.569024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhzz\" (UniqueName: \"kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz\") pod \"scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.682048 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.681951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:44:55.804288 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.804266 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"]
Apr 28 19:44:55.806662 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:44:55.806613 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71456e7_6565_4dae_8f3d_b1eef493f050.slice/crio-865374ad45c13354daef8e56ddebc17c2af1cad571394f71fbe7cb7d9db184a7 WatchSource:0}: Error finding container 865374ad45c13354daef8e56ddebc17c2af1cad571394f71fbe7cb7d9db184a7: Status 404 returned error can't find the container with id 865374ad45c13354daef8e56ddebc17c2af1cad571394f71fbe7cb7d9db184a7
Apr 28 19:44:55.808424 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:55.808405 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:44:56.315645 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:56.315602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerStarted","Data":"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611"}
Apr 28 19:44:56.315816 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:44:56.315655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerStarted","Data":"865374ad45c13354daef8e56ddebc17c2af1cad571394f71fbe7cb7d9db184a7"}
Apr 28 19:45:08.353004 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.352977 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-inline-config-test-kserve-5b88fd65b4-trp8d_35878cf7-e030-4e54-95e7-2ef2ff42df25/storage-initializer/0.log"
Apr 28 19:45:08.353413 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.353020 2578 generic.go:358] "Generic (PLEG): container finished" podID="35878cf7-e030-4e54-95e7-2ef2ff42df25" containerID="271cda1093770c6ebab22b8416a06c35037280bb7ee9eb498d55515a6703d5be" exitCode=137
Apr 28 19:45:08.353413 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.353097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d" event={"ID":"35878cf7-e030-4e54-95e7-2ef2ff42df25","Type":"ContainerDied","Data":"271cda1093770c6ebab22b8416a06c35037280bb7ee9eb498d55515a6703d5be"}
Apr 28 19:45:08.455721 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.455694 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-inline-config-test-kserve-5b88fd65b4-trp8d_35878cf7-e030-4e54-95e7-2ef2ff42df25/storage-initializer/0.log"
Apr 28 19:45:08.455829 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.455763 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:45:08.551151 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551058 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551151 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551130 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551390 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551159 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9c5m\" (UniqueName: \"kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551390 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551191 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551390 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551218 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551390 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551369 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551589 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551418 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home" (OuterVolumeSpecName: "home") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:08.551589 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir\") pod \"35878cf7-e030-4e54-95e7-2ef2ff42df25\" (UID: \"35878cf7-e030-4e54-95e7-2ef2ff42df25\") "
Apr 28 19:45:08.551589 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551466 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache" (OuterVolumeSpecName: "model-cache") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:08.551766 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551697 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-home\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.551766 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551715 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-model-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.551766 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.551734 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:08.553405 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.553379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm" (OuterVolumeSpecName: "dshm") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:08.553508 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.553459 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m" (OuterVolumeSpecName: "kube-api-access-c9c5m") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "kube-api-access-c9c5m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:45:08.553804 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.553777 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:45:08.587573 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.587542 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "35878cf7-e030-4e54-95e7-2ef2ff42df25" (UID: "35878cf7-e030-4e54-95e7-2ef2ff42df25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:08.652868 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.652838 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-dshm\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.652868 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.652867 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.653045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.652879 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9c5m\" (UniqueName: \"kubernetes.io/projected/35878cf7-e030-4e54-95e7-2ef2ff42df25-kube-api-access-c9c5m\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.653045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.652889 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35878cf7-e030-4e54-95e7-2ef2ff42df25-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:08.653045 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:08.652897 2578 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35878cf7-e030-4e54-95e7-2ef2ff42df25-tmp-dir\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:45:09.357604 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.357578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-inline-config-test-kserve-5b88fd65b4-trp8d_35878cf7-e030-4e54-95e7-2ef2ff42df25/storage-initializer/0.log"
Apr 28 19:45:09.358057 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.357700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d" event={"ID":"35878cf7-e030-4e54-95e7-2ef2ff42df25","Type":"ContainerDied","Data":"98587dff95bfc2f69c395059be8246339d695ba648d93901b70d90acbe3fd449"}
Apr 28 19:45:09.358057 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.357734 2578 scope.go:117] "RemoveContainer" containerID="271cda1093770c6ebab22b8416a06c35037280bb7ee9eb498d55515a6703d5be"
Apr 28 19:45:09.358057 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.357736 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"
Apr 28 19:45:09.390675 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.390648 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:45:09.393185 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:09.393162 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5b88fd65b4-trp8d"]
Apr 28 19:45:11.195535 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:45:11.195502 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35878cf7-e030-4e54-95e7-2ef2ff42df25" path="/var/lib/kubelet/pods/35878cf7-e030-4e54-95e7-2ef2ff42df25/volumes"
Apr 28 19:46:39.199772 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:46:39.199670 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:46:39.206585 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:46:39.203849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log"
Apr 28 19:48:35.996310 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:35.996274 2578 generic.go:358] "Generic (PLEG): container finished" podID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerID="cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611" exitCode=0
Apr 28 19:48:35.996701 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:35.996337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerDied","Data":"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611"}
Apr 28 19:48:37.000614 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:37.000581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerStarted","Data":"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe"}
Apr 28 19:48:37.027938 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:37.027895 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" podStartSLOduration=222.02788121 podStartE2EDuration="3m42.02788121s" podCreationTimestamp="2026-04-28 19:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:48:37.026429893 +0000 UTC m=+1918.376871300" watchObservedRunningTime="2026-04-28 19:48:37.02788121 +0000 UTC m=+1918.378322616"
Apr 28 19:48:45.682736 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:45.682692 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:48:45.682736 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:45.682737 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:48:45.694975 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:45.694953 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:48:46.039894 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:46.039817 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:48:46.782728 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:46.782694 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"]
Apr 28 19:48:47.217078 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:47.217037 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found
Apr 28 19:48:47.217280 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:47.217113 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs podName:f71456e7-6565-4dae-8f3d-b1eef493f050 nodeName:}" failed. No retries permitted until 2026-04-28 19:48:47.717096677 +0000 UTC m=+1929.067538061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs") pod "scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found
Apr 28 19:48:47.721335 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:47.721304 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found
Apr 28 19:48:47.721504 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:47.721377 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs podName:f71456e7-6565-4dae-8f3d-b1eef493f050 nodeName:}" failed. No retries permitted until 2026-04-28 19:48:48.72136273 +0000 UTC m=+1930.071804114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs") pod "scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found
Apr 28 19:48:48.034467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.034356 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerName="main" containerID="cri-o://16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe" gracePeriod=30
Apr 28 19:48:48.272657 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.272617 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"
Apr 28 19:48:48.427087 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427054 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") "
Apr 28 19:48:48.427281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427099 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") "
Apr 28 19:48:48.427281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427145 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") "
Apr 28 19:48:48.427281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427171 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") "
Apr 28 19:48:48.427281 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427193 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") "
Apr 28 19:48:48.427512 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427452 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache" (OuterVolumeSpecName: "model-cache") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:48:48.427512 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427465 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "tmp-dir".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:48:48.427512 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427512 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhzz\" (UniqueName: \"kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " Apr 28 19:48:48.427726 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427564 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home" (OuterVolumeSpecName: "home") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:48:48.427883 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.427860 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location\") pod \"f71456e7-6565-4dae-8f3d-b1eef493f050\" (UID: \"f71456e7-6565-4dae-8f3d-b1eef493f050\") " Apr 28 19:48:48.428060 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.428045 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-model-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.428129 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.428066 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-home\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.428129 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.428079 2578 reconciler_common.go:299] "Volume detached for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-tmp-dir\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.429582 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.429558 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm" (OuterVolumeSpecName: "dshm") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:48:48.429686 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.429616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:48:48.430097 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.430074 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz" (OuterVolumeSpecName: "kube-api-access-mmhzz") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "kube-api-access-mmhzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:48:48.487940 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.487900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f71456e7-6565-4dae-8f3d-b1eef493f050" (UID: "f71456e7-6565-4dae-8f3d-b1eef493f050"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:48:48.529358 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.529324 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmhzz\" (UniqueName: \"kubernetes.io/projected/f71456e7-6565-4dae-8f3d-b1eef493f050-kube-api-access-mmhzz\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.529358 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.529353 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.529358 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.529363 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f71456e7-6565-4dae-8f3d-b1eef493f050-dshm\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:48.529573 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:48.529372 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f71456e7-6565-4dae-8f3d-b1eef493f050-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 19:48:49.038405 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.038374 2578 generic.go:358] "Generic (PLEG): container finished" podID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerID="16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe" exitCode=0 Apr 28 19:48:49.038819 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.038411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerDied","Data":"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe"} Apr 28 19:48:49.038819 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:48:49.038433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" event={"ID":"f71456e7-6565-4dae-8f3d-b1eef493f050","Type":"ContainerDied","Data":"865374ad45c13354daef8e56ddebc17c2af1cad571394f71fbe7cb7d9db184a7"} Apr 28 19:48:49.038819 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.038449 2578 scope.go:117] "RemoveContainer" containerID="16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe" Apr 28 19:48:49.038819 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.038448 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7" Apr 28 19:48:49.048104 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.048083 2578 scope.go:117] "RemoveContainer" containerID="cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611" Apr 28 19:48:49.060524 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.060503 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"] Apr 28 19:48:49.066553 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.066526 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-78659c4fb-jxgp7"] Apr 28 19:48:49.113585 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.113562 2578 scope.go:117] "RemoveContainer" containerID="16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe" Apr 28 19:48:49.113936 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:49.113915 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe\": container with ID starting with 16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe not found: ID does not exist" 
containerID="16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe" Apr 28 19:48:49.113992 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.113946 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe"} err="failed to get container status \"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe\": rpc error: code = NotFound desc = could not find container \"16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe\": container with ID starting with 16f3aeef047b6a4ee50835fd947d6b203c9bf43e61be5457391d0e468d1d33fe not found: ID does not exist" Apr 28 19:48:49.113992 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.113965 2578 scope.go:117] "RemoveContainer" containerID="cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611" Apr 28 19:48:49.114225 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:48:49.114209 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611\": container with ID starting with cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611 not found: ID does not exist" containerID="cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611" Apr 28 19:48:49.114277 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.114232 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611"} err="failed to get container status \"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611\": rpc error: code = NotFound desc = could not find container \"cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611\": container with ID starting with cf3da08641473d9f4d6f1d7234346c836dea8123ce422c05447c92aec3d21611 not found: ID does not exist" Apr 28 
19:48:49.195844 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:48:49.195813 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" path="/var/lib/kubelet/pods/f71456e7-6565-4dae-8f3d-b1eef493f050/volumes" Apr 28 19:49:00.560764 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.560733 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"] Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561017 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerName="storage-initializer" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561027 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerName="storage-initializer" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561037 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35878cf7-e030-4e54-95e7-2ef2ff42df25" containerName="storage-initializer" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561042 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="35878cf7-e030-4e54-95e7-2ef2ff42df25" containerName="storage-initializer" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561050 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerName="main" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561055 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" containerName="main" Apr 28 19:49:00.561111 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561111 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f71456e7-6565-4dae-8f3d-b1eef493f050" 
containerName="main" Apr 28 19:49:00.561314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.561119 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="35878cf7-e030-4e54-95e7-2ef2ff42df25" containerName="storage-initializer" Apr 28 19:49:00.566373 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.566351 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.569891 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.569868 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\"" Apr 28 19:49:00.570023 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.569910 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:49:00.570139 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.570124 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 28 19:49:00.570218 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.570201 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:49:00.574430 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.574409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"] Apr 28 19:49:00.612085 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612085 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612285 
ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.612285 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.612194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712579 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712785 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712785 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.712884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.713037 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.712967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: 
\"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.713131 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.713112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.713191 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.713166 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.713229 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.713204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.713314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.713294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.715025 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.714999 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.715395 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.715375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.720679 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.720661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc\") pod \"scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.816714 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.816627 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"] Apr 28 19:49:00.820553 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.820534 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" Apr 28 19:49:00.823006 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.822985 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-dxkc8\"" Apr 28 19:49:00.835485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.835465 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"] Apr 28 19:49:00.877171 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.877141 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" Apr 28 19:49:00.915160 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" Apr 28 19:49:00.915295 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" Apr 28 19:49:00.915295 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:00.915295 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:00.915422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:00.915422 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.915372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqv9p\" (UniqueName: \"kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:00.997536 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:00.997504 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"]
Apr 28 19:49:01.000649 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:49:01.000608 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee0d805_54cd_459e_a12f_69436c9932b4.slice/crio-1a4217795eff68519bc44c75295f0867b4e90b59e9ecc37d47a468e2bb82c1ce WatchSource:0}: Error finding container 1a4217795eff68519bc44c75295f0867b4e90b59e9ecc37d47a468e2bb82c1ce: Status 404 returned error can't find the container with id 1a4217795eff68519bc44c75295f0867b4e90b59e9ecc37d47a468e2bb82c1ce
Apr 28 19:49:01.015975 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.015947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.015991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqv9p\" (UniqueName: \"kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016067 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016269 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016407 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016506 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016506 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.016588 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.016508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.018508 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.018490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.024353 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.024329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqv9p\" (UniqueName: \"kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.076025 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.075965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerStarted","Data":"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e"}
Apr 28 19:49:01.076025 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.076002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerStarted","Data":"1a4217795eff68519bc44c75295f0867b4e90b59e9ecc37d47a468e2bb82c1ce"}
Apr 28 19:49:01.130708 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.130680 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:01.263058 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:01.263028 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"]
Apr 28 19:49:01.265324 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:49:01.265292 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800a7fd1_2327_4c73_b6b7_f3de166f49f3.slice/crio-52075b7e3e8b381219e7d9ae62231711090749faa5d49177894330ab03c7cf48 WatchSource:0}: Error finding container 52075b7e3e8b381219e7d9ae62231711090749faa5d49177894330ab03c7cf48: Status 404 returned error can't find the container with id 52075b7e3e8b381219e7d9ae62231711090749faa5d49177894330ab03c7cf48
Apr 28 19:49:02.081389 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:02.081356 2578 generic.go:358] "Generic (PLEG): container finished" podID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerID="0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d" exitCode=0
Apr 28 19:49:02.081865 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:02.081437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerDied","Data":"0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d"}
Apr 28 19:49:02.081865 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:02.081479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerStarted","Data":"52075b7e3e8b381219e7d9ae62231711090749faa5d49177894330ab03c7cf48"}
Apr 28 19:49:04.090486 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:04.090447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerStarted","Data":"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51"}
Apr 28 19:49:33.195873 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:33.195832 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:49:33.196441 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:33.196418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerStarted","Data":"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41"}
Apr 28 19:49:33.196487 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:33.196454 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:33.216248 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:33.216206 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podStartSLOduration=2.914024073 podStartE2EDuration="33.216192654s" podCreationTimestamp="2026-04-28 19:49:00 +0000 UTC" firstStartedPulling="2026-04-28 19:49:02.082881192 +0000 UTC m=+1943.433322576" lastFinishedPulling="2026-04-28 19:49:32.385049771 +0000 UTC m=+1973.735491157" observedRunningTime="2026-04-28 19:49:33.215854038 +0000 UTC m=+1974.566295446" watchObservedRunningTime="2026-04-28 19:49:33.216192654 +0000 UTC m=+1974.566634059"
Apr 28 19:49:34.197867 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:34.197834 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:49:41.131817 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.131781 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:41.131817 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.131828 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:41.133438 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.133398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:49:41.133559 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.133464 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:41.218962 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.218936 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:49:41.218962 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:41.218942 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:49:42.222196 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:42.222159 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:49:52.223117 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:49:52.223079 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:02.222688 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:02.222625 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:12.222363 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:12.222321 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:22.223160 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:22.223114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:27.354784 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:27.354745 2578 generic.go:358] "Generic (PLEG): container finished" podID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerID="6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e" exitCode=0
Apr 28 19:50:27.355091 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:27.354811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerDied","Data":"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e"}
Apr 28 19:50:27.355931 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:27.355913 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:50:28.359574 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:28.359542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerStarted","Data":"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460"}
Apr 28 19:50:28.380155 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:28.380108 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" podStartSLOduration=88.380094108 podStartE2EDuration="1m28.380094108s" podCreationTimestamp="2026-04-28 19:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:28.377793876 +0000 UTC m=+2029.728235317" watchObservedRunningTime="2026-04-28 19:50:28.380094108 +0000 UTC m=+2029.730535514"
Apr 28 19:50:30.877887 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:30.877853 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:30.877887 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:30.877891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:30.890365 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:30.890336 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:31.379533 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:31.379498 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:32.222340 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.222302 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:32.679675 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.679615 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"]
Apr 28 19:50:32.687336 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.687305 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"]
Apr 28 19:50:32.687768 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.687737 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" containerID="cri-o://41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51" gracePeriod=30
Apr 28 19:50:32.687994 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.687934 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="tokenizer" containerID="cri-o://da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41" gracePeriod=30
Apr 28 19:50:32.689462 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:32.689421 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 28 19:50:33.375769 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.375729 2578 generic.go:358] "Generic (PLEG): container finished" podID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerID="41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51" exitCode=0
Apr 28 19:50:33.376319 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.375806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerDied","Data":"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51"}
Apr 28 19:50:33.376319 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.376053 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="main" containerID="cri-o://3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460" gracePeriod=30
Apr 28 19:50:33.634388 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.634324 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:33.764518 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764488 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764691 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764528 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764691 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764552 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764691 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764569 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764691 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764610 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764901 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764711 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764901 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764746 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir\") pod \"4ee0d805-54cd-459e-a12f-69436c9932b4\" (UID: \"4ee0d805-54cd-459e-a12f-69436c9932b4\") "
Apr 28 19:50:33.764901 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.764861 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home" (OuterVolumeSpecName: "home") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.765038 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.765011 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-home\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.765100 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.765055 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.765694 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.765662 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache" (OuterVolumeSpecName: "model-cache") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.767014 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.766972 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc" (OuterVolumeSpecName: "kube-api-access-gjmxc") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "kube-api-access-gjmxc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:50:33.767225 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.767203 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:50:33.767367 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.767329 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm" (OuterVolumeSpecName: "dshm") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.831790 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.831753 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ee0d805-54cd-459e-a12f-69436c9932b4" (UID: "4ee0d805-54cd-459e-a12f-69436c9932b4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.866138 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866116 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-model-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.866224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866145 2578 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-tmp-dir\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.866224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866158 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee0d805-54cd-459e-a12f-69436c9932b4-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.866224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866172 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjmxc\" (UniqueName: \"kubernetes.io/projected/4ee0d805-54cd-459e-a12f-69436c9932b4-kube-api-access-gjmxc\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.866224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866185 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-dshm\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.866224 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.866199 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee0d805-54cd-459e-a12f-69436c9932b4-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:33.870809 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.870790 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
Apr 28 19:50:33.966890 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.966815 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.966890 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.966852 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.966890 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.966875 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.967127 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.966912 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.967127 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.966959 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.967127 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.967005 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqv9p\" (UniqueName: \"kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p\") pod \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\" (UID: \"800a7fd1-2327-4c73-b6b7-f3de166f49f3\") "
Apr 28 19:50:33.967275 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.967137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.967275 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.967181 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.967381 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.967293 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.967714 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.967689 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:33.969129 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.969103 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:50:33.969219 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:33.969168 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p" (OuterVolumeSpecName: "kube-api-access-fqv9p") pod "800a7fd1-2327-4c73-b6b7-f3de166f49f3" (UID: "800a7fd1-2327-4c73-b6b7-f3de166f49f3"). InnerVolumeSpecName "kube-api-access-fqv9p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:50:34.068159 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068120 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-tmp\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.068159 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068153 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.068159 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068163 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqv9p\" (UniqueName: \"kubernetes.io/projected/800a7fd1-2327-4c73-b6b7-f3de166f49f3-kube-api-access-fqv9p\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.068159 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068172 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.068423 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068181 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.068423 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.068189 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/800a7fd1-2327-4c73-b6b7-f3de166f49f3-tokenizer-uds\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 19:50:34.380222 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.380186 2578 generic.go:358] "Generic (PLEG): container finished" podID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerID="3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460" exitCode=0
Apr 28 19:50:34.380735 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.380270 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"
Apr 28 19:50:34.380735 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.380275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerDied","Data":"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460"}
Apr 28 19:50:34.380735 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.380323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2" event={"ID":"4ee0d805-54cd-459e-a12f-69436c9932b4","Type":"ContainerDied","Data":"1a4217795eff68519bc44c75295f0867b4e90b59e9ecc37d47a468e2bb82c1ce"}
Apr 28 19:50:34.380735 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.380346 2578 scope.go:117] "RemoveContainer" containerID="3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460"
Apr 28 19:50:34.382106 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.382082 2578 generic.go:358] "Generic (PLEG): container finished" podID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerID="da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41" exitCode=0
Apr 28 19:50:34.382219 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.382133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"
event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerDied","Data":"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41"} Apr 28 19:50:34.382219 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.382142 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" Apr 28 19:50:34.382219 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.382163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76" event={"ID":"800a7fd1-2327-4c73-b6b7-f3de166f49f3","Type":"ContainerDied","Data":"52075b7e3e8b381219e7d9ae62231711090749faa5d49177894330ab03c7cf48"} Apr 28 19:50:34.388724 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.388706 2578 scope.go:117] "RemoveContainer" containerID="6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e" Apr 28 19:50:34.405307 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.405278 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"] Apr 28 19:50:34.408449 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.408430 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-b55c6799b-kpsr2"] Apr 28 19:50:34.418774 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.418747 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"] Apr 28 19:50:34.422263 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.422234 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57875d45gg76"] Apr 28 19:50:34.454779 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.454760 2578 scope.go:117] "RemoveContainer" 
containerID="3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460" Apr 28 19:50:34.455065 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:50:34.455046 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460\": container with ID starting with 3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460 not found: ID does not exist" containerID="3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460" Apr 28 19:50:34.455139 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.455076 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460"} err="failed to get container status \"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460\": rpc error: code = NotFound desc = could not find container \"3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460\": container with ID starting with 3922013d6e74716c43dbf7a69f027716ae714a3493a9128ce1d44fa599047460 not found: ID does not exist" Apr 28 19:50:34.455139 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.455107 2578 scope.go:117] "RemoveContainer" containerID="6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e" Apr 28 19:50:34.455409 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:50:34.455384 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e\": container with ID starting with 6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e not found: ID does not exist" containerID="6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e" Apr 28 19:50:34.455448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.455416 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e"} err="failed to get container status \"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e\": rpc error: code = NotFound desc = could not find container \"6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e\": container with ID starting with 6f86c7b30381f1d0a0a56f071e4cd8b4b4c53089acfd831d2caaea7b3e23ca9e not found: ID does not exist" Apr 28 19:50:34.455448 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.455432 2578 scope.go:117] "RemoveContainer" containerID="da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41" Apr 28 19:50:34.462443 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.462427 2578 scope.go:117] "RemoveContainer" containerID="41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51" Apr 28 19:50:34.469192 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.469177 2578 scope.go:117] "RemoveContainer" containerID="0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d" Apr 28 19:50:34.476047 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476031 2578 scope.go:117] "RemoveContainer" containerID="da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41" Apr 28 19:50:34.476281 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:50:34.476264 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41\": container with ID starting with da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41 not found: ID does not exist" containerID="da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41" Apr 28 19:50:34.476340 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476289 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41"} 
err="failed to get container status \"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41\": rpc error: code = NotFound desc = could not find container \"da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41\": container with ID starting with da9c473db2d8513a5980258dd7fb34045dc34ac115e93dc6b15bf46399824d41 not found: ID does not exist" Apr 28 19:50:34.476340 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476309 2578 scope.go:117] "RemoveContainer" containerID="41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51" Apr 28 19:50:34.476528 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:50:34.476511 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51\": container with ID starting with 41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51 not found: ID does not exist" containerID="41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51" Apr 28 19:50:34.476570 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476534 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51"} err="failed to get container status \"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51\": rpc error: code = NotFound desc = could not find container \"41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51\": container with ID starting with 41cfc9f4aef5c3041823ad4e63231493c80c141c127e86860ddfe9a110d54d51 not found: ID does not exist" Apr 28 19:50:34.476570 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476549 2578 scope.go:117] "RemoveContainer" containerID="0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d" Apr 28 19:50:34.476785 ip-10-0-143-22 kubenswrapper[2578]: E0428 19:50:34.476770 2578 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d\": container with ID starting with 0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d not found: ID does not exist" containerID="0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d" Apr 28 19:50:34.476834 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:34.476790 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d"} err="failed to get container status \"0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d\": rpc error: code = NotFound desc = could not find container \"0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d\": container with ID starting with 0740ce98f5ccd4c62ab71d23f70ed726d7cba4aff0a4a4161412b56a0337b02d not found: ID does not exist" Apr 28 19:50:35.196427 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:35.196398 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" path="/var/lib/kubelet/pods/4ee0d805-54cd-459e-a12f-69436c9932b4/volumes" Apr 28 19:50:35.196846 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:35.196832 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" path="/var/lib/kubelet/pods/800a7fd1-2327-4c73-b6b7-f3de166f49f3/volumes" Apr 28 19:50:57.952023 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.951987 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952235 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="storage-initializer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: 
I0428 19:50:57.952245 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="storage-initializer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952258 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952263 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952270 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="storage-initializer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952275 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="storage-initializer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952281 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="tokenizer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952286 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="tokenizer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952297 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="main" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952302 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="main" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952343 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="tokenizer" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952352 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="800a7fd1-2327-4c73-b6b7-f3de166f49f3" containerName="main" Apr 28 19:50:57.952467 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.952357 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ee0d805-54cd-459e-a12f-69436c9932b4" containerName="main" Apr 28 19:50:57.956462 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.956442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:57.959083 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.959036 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:50:57.960122 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.960106 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\"" Apr 28 19:50:57.960214 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.960136 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:50:57.960214 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.960180 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 28 19:50:57.966223 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:57.966200 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 19:50:58.042018 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.041983 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042173 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042173 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042067 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042173 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtckz\" (UniqueName: \"kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.042314 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.042219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.142898 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.142872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143005 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.142909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: 
\"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143005 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.142927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143005 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.142958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143147 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtckz\" (UniqueName: \"kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143147 ip-10-0-143-22 
kubenswrapper[2578]: I0428 19:50:58.143063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143366 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143365 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.143485 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.143446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home\") 
pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.145460 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.145429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.145712 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.145695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.151188 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.151158 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtckz\" (UniqueName: \"kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-txhf7\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.217319 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.217243 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 19:50:58.220725 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.220707 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.223437 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.223412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-qvngx\"" Apr 28 19:50:58.242055 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.242012 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 19:50:58.266749 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.266722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:50:58.345133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.345133 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.345331 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6rsv7\" (UniqueName: \"kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.345331 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.345331 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.345331 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.345275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.398896 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.398869 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 19:50:58.401323 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:50:58.401296 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6338f854_88af_493c_9ecc_a9da6b422d51.slice/crio-968e150d0077919035bd2e97f2e611d71c0b38c3d23ae230f833a8eabbef5ffb WatchSource:0}: Error finding container 968e150d0077919035bd2e97f2e611d71c0b38c3d23ae230f833a8eabbef5ffb: Status 404 returned error can't find the container with id 968e150d0077919035bd2e97f2e611d71c0b38c3d23ae230f833a8eabbef5ffb Apr 28 19:50:58.446255 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rsv7\" (UniqueName: \"kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446370 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446370 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: 
\"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446370 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446551 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446551 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446707 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446777 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446777 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.446880 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.446775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.448839 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.448820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.454303 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.454278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rsv7\" (UniqueName: \"kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7\") pod \"precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.455963 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.455938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerStarted","Data":"968e150d0077919035bd2e97f2e611d71c0b38c3d23ae230f833a8eabbef5ffb"} Apr 28 19:50:58.529270 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.529182 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:50:58.656268 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:58.656234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 19:50:58.658611 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:50:58.658582 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9beb8938_afb4_495e_a33a_f293d96f74a2.slice/crio-f3eaf5d0902729bd7b3a390a34029825ef41a9094abe95d686748bf7a68e66d2 WatchSource:0}: Error finding container f3eaf5d0902729bd7b3a390a34029825ef41a9094abe95d686748bf7a68e66d2: Status 404 returned error can't find the container with id f3eaf5d0902729bd7b3a390a34029825ef41a9094abe95d686748bf7a68e66d2 Apr 28 19:50:59.460546 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:59.460511 2578 generic.go:358] "Generic (PLEG): container finished" podID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerID="d27e03cb6a5ef26f37db59942446d25798c44493cca1d0b93756b909acffec63" exitCode=0 Apr 28 19:50:59.460998 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:59.460604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerDied","Data":"d27e03cb6a5ef26f37db59942446d25798c44493cca1d0b93756b909acffec63"} Apr 28 19:50:59.460998 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:50:59.460701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerStarted","Data":"f3eaf5d0902729bd7b3a390a34029825ef41a9094abe95d686748bf7a68e66d2"} Apr 28 19:50:59.462138 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:50:59.462110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerStarted","Data":"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858"} Apr 28 19:51:00.467312 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:00.467274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerStarted","Data":"fab9dc8194033f90dc8b06a3bb6c26718f3e8d7353e8911ecf50214d1a7f2f1f"} Apr 28 19:51:00.467693 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:00.467320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerStarted","Data":"c699e6639e0186a2b7fd104c49a247fc379692de399ac222dc95b1c278134e47"} Apr 28 19:51:00.489990 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:00.489938 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" podStartSLOduration=2.489921646 podStartE2EDuration="2.489921646s" podCreationTimestamp="2026-04-28 19:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:00.487927042 +0000 UTC m=+2061.838368446" watchObservedRunningTime="2026-04-28 19:51:00.489921646 +0000 UTC m=+2061.840363051" Apr 28 19:51:01.472982 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:01.472894 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:02.477399 ip-10-0-143-22 kubenswrapper[2578]: I0428 
19:51:02.477365 2578 generic.go:358] "Generic (PLEG): container finished" podID="6338f854-88af-493c-9ecc-a9da6b422d51" containerID="afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858" exitCode=0 Apr 28 19:51:02.477869 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:02.477433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerDied","Data":"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858"} Apr 28 19:51:03.484884 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:03.484852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerStarted","Data":"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7"} Apr 28 19:51:03.509312 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:03.509239 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" podStartSLOduration=6.5092267790000005 podStartE2EDuration="6.509226779s" podCreationTimestamp="2026-04-28 19:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:03.507357163 +0000 UTC m=+2064.857798568" watchObservedRunningTime="2026-04-28 19:51:03.509226779 +0000 UTC m=+2064.859668182" Apr 28 19:51:08.267384 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.267350 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:51:08.267384 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.267386 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:51:08.279755 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.279734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:51:08.512594 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.512563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 19:51:08.529432 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.529353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:08.529432 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.529382 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:08.530795 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:51:08.530755 2578 logging.go:55] [core] [Channel #68 SubChannel #69]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.28:9003", ServerName: "10.132.0.28:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.28:9003: connect: connection refused" Apr 28 19:51:08.532199 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:08.532175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:09.505466 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:09.505438 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:09.529597 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:09.529528 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.28:9003\" within 1s: context deadline exceeded" Apr 28 19:51:18.529884 ip-10-0-143-22 kubenswrapper[2578]: W0428 19:51:18.529850 2578 logging.go:55] [core] [Channel #70 SubChannel #71]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.28:9003", ServerName: "10.132.0.28:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.28:9003: connect: connection refused" Apr 28 19:51:19.529989 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:19.529945 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.28:9003\" within 1s: context deadline exceeded" Apr 28 19:51:30.510107 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:30.510076 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 19:51:39.219731 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:39.219617 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:51:39.224475 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:51:39.224455 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:56:39.244458 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:56:39.244342 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 19:56:39.250110 ip-10-0-143-22 kubenswrapper[2578]: I0428 19:56:39.250090 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:01:39.265519 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:01:39.265420 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:01:39.273803 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:01:39.273781 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:05:34.870061 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:34.870024 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 20:05:34.870601 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:34.870323 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" containerID="cri-o://c699e6639e0186a2b7fd104c49a247fc379692de399ac222dc95b1c278134e47" gracePeriod=30 Apr 28 20:05:34.870601 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:34.870397 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="tokenizer" containerID="cri-o://fab9dc8194033f90dc8b06a3bb6c26718f3e8d7353e8911ecf50214d1a7f2f1f" gracePeriod=30 Apr 28 20:05:34.878163 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:34.878138 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 20:05:34.878444 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:34.878417 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="main" 
containerID="cri-o://e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7" gracePeriod=30 Apr 28 20:05:35.121456 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.121402 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 20:05:35.167532 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167505 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167539 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167599 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtckz\" (UniqueName: \"kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167626 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.167764 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167705 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm\") pod \"6338f854-88af-493c-9ecc-a9da6b422d51\" (UID: \"6338f854-88af-493c-9ecc-a9da6b422d51\") " Apr 28 20:05:35.168086 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167828 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache" (OuterVolumeSpecName: "model-cache") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:35.168086 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.167932 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-model-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.168185 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.168149 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:35.168571 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.168546 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home" (OuterVolumeSpecName: "home") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:35.170810 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.170782 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm" (OuterVolumeSpecName: "dshm") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:35.170962 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.170885 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz" (OuterVolumeSpecName: "kube-api-access-qtckz") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "kube-api-access-qtckz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:05:35.171135 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.171106 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:05:35.172866 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.172839 2578 generic.go:358] "Generic (PLEG): container finished" podID="6338f854-88af-493c-9ecc-a9da6b422d51" containerID="e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7" exitCode=0 Apr 28 20:05:35.172992 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.172975 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" Apr 28 20:05:35.173558 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.172973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerDied","Data":"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7"} Apr 28 20:05:35.173558 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.173112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7" event={"ID":"6338f854-88af-493c-9ecc-a9da6b422d51","Type":"ContainerDied","Data":"968e150d0077919035bd2e97f2e611d71c0b38c3d23ae230f833a8eabbef5ffb"} Apr 28 20:05:35.173558 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.173139 2578 scope.go:117] "RemoveContainer" containerID="e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7" Apr 28 20:05:35.175248 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.175228 2578 generic.go:358] "Generic (PLEG): container finished" podID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerID="c699e6639e0186a2b7fd104c49a247fc379692de399ac222dc95b1c278134e47" exitCode=0 Apr 28 20:05:35.175346 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.175273 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerDied","Data":"c699e6639e0186a2b7fd104c49a247fc379692de399ac222dc95b1c278134e47"} Apr 28 20:05:35.184514 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.184490 2578 scope.go:117] "RemoveContainer" containerID="afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858" Apr 28 20:05:35.234625 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.234581 2578 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6338f854-88af-493c-9ecc-a9da6b422d51" (UID: "6338f854-88af-493c-9ecc-a9da6b422d51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:35.251773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.251748 2578 scope.go:117] "RemoveContainer" containerID="e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7" Apr 28 20:05:35.252101 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:05:35.252072 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7\": container with ID starting with e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7 not found: ID does not exist" containerID="e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7" Apr 28 20:05:35.252180 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.252112 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7"} err="failed to get container status \"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7\": rpc error: code = NotFound desc = could not find container \"e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7\": container with ID starting with e4be8af68c3ec7a115b477888abc5d91950eea84d18c8e8afbaea799c8cb5aa7 not found: ID does not exist" Apr 28 20:05:35.252180 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.252132 2578 scope.go:117] "RemoveContainer" containerID="afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858" Apr 28 20:05:35.252396 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:05:35.252380 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858\": container with ID starting with afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858 not found: ID does not exist" containerID="afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858" Apr 28 20:05:35.252442 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.252401 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858"} err="failed to get container status \"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858\": rpc error: code = NotFound desc = could not find container \"afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858\": container with ID starting with afe5f37392a3cc864a99a831d8df023e283ecedceb0d204f02006482312ab858 not found: ID does not exist" Apr 28 20:05:35.268978 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.268951 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtckz\" (UniqueName: \"kubernetes.io/projected/6338f854-88af-493c-9ecc-a9da6b422d51-kube-api-access-qtckz\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.268978 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.268975 2578 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-tmp-dir\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.269085 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.268987 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.269085 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.269000 2578 
reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-dshm\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.269085 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.269008 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6338f854-88af-493c-9ecc-a9da6b422d51-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.269085 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.269017 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6338f854-88af-493c-9ecc-a9da6b422d51-home\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:35.495793 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.495764 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 20:05:35.499353 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:35.499327 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-txhf7"] Apr 28 20:05:36.181355 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.181322 2578 generic.go:358] "Generic (PLEG): container finished" podID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerID="fab9dc8194033f90dc8b06a3bb6c26718f3e8d7353e8911ecf50214d1a7f2f1f" exitCode=0 Apr 28 20:05:36.181815 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.181365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerDied","Data":"fab9dc8194033f90dc8b06a3bb6c26718f3e8d7353e8911ecf50214d1a7f2f1f"} Apr 28 20:05:36.212301 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.212279 2578 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 20:05:36.276912 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.276875 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.276912 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.276913 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.277126 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.276936 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rsv7\" (UniqueName: \"kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.277126 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.276959 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.277126 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.276996 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: 
\"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.277126 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.277045 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds\") pod \"9beb8938-afb4-495e-a33a-f293d96f74a2\" (UID: \"9beb8938-afb4-495e-a33a-f293d96f74a2\") " Apr 28 20:05:36.277392 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.277311 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:36.277452 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.277389 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:36.277452 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.277326 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:36.277789 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.277765 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:36.279196 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.279175 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:05:36.279297 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.279279 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7" (OuterVolumeSpecName: "kube-api-access-6rsv7") pod "9beb8938-afb4-495e-a33a-f293d96f74a2" (UID: "9beb8938-afb4-495e-a33a-f293d96f74a2"). InnerVolumeSpecName "kube-api-access-6rsv7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:05:36.377985 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.377951 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-uds\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:36.377985 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.377978 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:36.378169 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.377995 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9beb8938-afb4-495e-a33a-f293d96f74a2-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:36.378169 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.378007 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rsv7\" (UniqueName: \"kubernetes.io/projected/9beb8938-afb4-495e-a33a-f293d96f74a2-kube-api-access-6rsv7\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:36.378169 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.378016 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-tokenizer-tmp\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:36.378169 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:36.378024 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9beb8938-afb4-495e-a33a-f293d96f74a2-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:05:37.185839 ip-10-0-143-22 kubenswrapper[2578]: 
I0428 20:05:37.185795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" event={"ID":"9beb8938-afb4-495e-a33a-f293d96f74a2","Type":"ContainerDied","Data":"f3eaf5d0902729bd7b3a390a34029825ef41a9094abe95d686748bf7a68e66d2"} Apr 28 20:05:37.185839 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.185846 2578 scope.go:117] "RemoveContainer" containerID="fab9dc8194033f90dc8b06a3bb6c26718f3e8d7353e8911ecf50214d1a7f2f1f" Apr 28 20:05:37.186292 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.185862 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98" Apr 28 20:05:37.194314 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.194289 2578 scope.go:117] "RemoveContainer" containerID="c699e6639e0186a2b7fd104c49a247fc379692de399ac222dc95b1c278134e47" Apr 28 20:05:37.195682 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.195662 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" path="/var/lib/kubelet/pods/6338f854-88af-493c-9ecc-a9da6b422d51/volumes" Apr 28 20:05:37.201671 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.201653 2578 scope.go:117] "RemoveContainer" containerID="d27e03cb6a5ef26f37db59942446d25798c44493cca1d0b93756b909acffec63" Apr 28 20:05:37.208342 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.208319 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 20:05:37.212102 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:37.212080 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-885b9d6ctwj98"] Apr 28 20:05:39.194928 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:39.194888 2578 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" path="/var/lib/kubelet/pods/9beb8938-afb4-495e-a33a-f293d96f74a2/volumes" Apr 28 20:05:55.391812 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.391776 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392060 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="main" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392071 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="main" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392081 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="storage-initializer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392087 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="storage-initializer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392097 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="tokenizer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392105 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="tokenizer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392117 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="storage-initializer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 
20:05:55.392122 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="storage-initializer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392128 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392132 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392190 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="tokenizer" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392200 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6338f854-88af-493c-9ecc-a9da6b422d51" containerName="main" Apr 28 20:05:55.392249 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.392205 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9beb8938-afb4-495e-a33a-f293d96f74a2" containerName="main" Apr 28 20:05:55.394051 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.394029 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.396779 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.396741 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:05:55.397704 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.397681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-j9q92\"" Apr 28 20:05:55.397704 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.397697 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:05:55.397866 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.397681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\"" Apr 28 20:05:55.398273 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.398248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 28 20:05:55.410677 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.410651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:05:55.414963 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.414943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.415105 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.414986 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.415105 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.415007 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.415105 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.415033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5cff\" (UniqueName: \"kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.415267 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.415098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.415267 ip-10-0-143-22 kubenswrapper[2578]: I0428 
20:05:55.415178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516198 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516360 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516360 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516222 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5cff\" (UniqueName: \"kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516360 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516241 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516360 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516573 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516699 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516750 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516750 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.516822 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.516767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.518873 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.518856 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.524810 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.524787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5cff\" (UniqueName: \"kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff\") pod 
\"stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.704540 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.704428 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:55.830986 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.830968 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:05:55.833811 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:05:55.833779 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4d5257_883e_411a_9561_a398809e611d.slice/crio-327434eeb6107a4aa47e0897b319f9317514a318eb55d19a44fa04d43385767d WatchSource:0}: Error finding container 327434eeb6107a4aa47e0897b319f9317514a318eb55d19a44fa04d43385767d: Status 404 returned error can't find the container with id 327434eeb6107a4aa47e0897b319f9317514a318eb55d19a44fa04d43385767d Apr 28 20:05:55.836014 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:55.835997 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:05:56.243213 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:56.243170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerStarted","Data":"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13"} Apr 28 20:05:56.243213 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:56.243218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" 
event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerStarted","Data":"327434eeb6107a4aa47e0897b319f9317514a318eb55d19a44fa04d43385767d"} Apr 28 20:05:57.247975 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:57.247934 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe4d5257-883e-411a-9561-a398809e611d" containerID="db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13" exitCode=0 Apr 28 20:05:57.248322 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:57.248013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerDied","Data":"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13"} Apr 28 20:05:58.253086 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:58.253052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerStarted","Data":"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62"} Apr 28 20:05:58.253086 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:58.253088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerStarted","Data":"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380"} Apr 28 20:05:58.253514 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:58.253180 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:05:58.275278 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:05:58.275223 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" podStartSLOduration=3.275206369 podStartE2EDuration="3.275206369s" podCreationTimestamp="2026-04-28 20:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:05:58.273613473 +0000 UTC m=+2959.624054893" watchObservedRunningTime="2026-04-28 20:05:58.275206369 +0000 UTC m=+2959.625647776" Apr 28 20:06:05.704709 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:05.704675 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:06:05.705139 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:05.704720 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:06:05.707425 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:05.707401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:06:06.278184 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:06.278142 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:06:27.282165 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:27.282132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:06:39.285064 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:06:39.284950 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:06:39.293649 ip-10-0-143-22 kubenswrapper[2578]: I0428 
20:06:39.293612 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:08:39.193523 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:39.193485 2578 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" secret="" err="secret \"stop-feature-test-epp-sa-dockercfg-j9q92\" not found" Apr 28 20:08:39.236020 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:39.235995 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:39.236157 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:39.236065 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs podName:fe4d5257-883e-411a-9561-a398809e611d nodeName:}" failed. No retries permitted until 2026-04-28 20:08:39.736048325 +0000 UTC m=+3121.086489708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" (UID: "fe4d5257-883e-411a-9561-a398809e611d") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:39.739729 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:39.739699 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:39.739889 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:39.739755 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs podName:fe4d5257-883e-411a-9561-a398809e611d nodeName:}" failed. 
No retries permitted until 2026-04-28 20:08:40.73974252 +0000 UTC m=+3122.090183903 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" (UID: "fe4d5257-883e-411a-9561-a398809e611d") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:40.748179 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:40.748149 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:40.748533 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:40.748215 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs podName:fe4d5257-883e-411a-9561-a398809e611d nodeName:}" failed. No retries permitted until 2026-04-28 20:08:42.748197214 +0000 UTC m=+3124.098638615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" (UID: "fe4d5257-883e-411a-9561-a398809e611d") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:42.761232 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:42.761202 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:42.761603 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:42.761270 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs podName:fe4d5257-883e-411a-9561-a398809e611d nodeName:}" failed. 
No retries permitted until 2026-04-28 20:08:46.761253283 +0000 UTC m=+3128.111694669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" (UID: "fe4d5257-883e-411a-9561-a398809e611d") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:46.799538 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:46.799506 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:46.799964 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:46.799581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs podName:fe4d5257-883e-411a-9561-a398809e611d nodeName:}" failed. No retries permitted until 2026-04-28 20:08:54.799565753 +0000 UTC m=+3136.150007138 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" (UID: "fe4d5257-883e-411a-9561-a398809e611d") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 28 20:08:46.832142 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:46.832111 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:08:46.832566 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:46.832534 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="main" containerID="cri-o://d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380" gracePeriod=30 Apr 28 20:08:46.832737 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:46.832607 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="tokenizer" containerID="cri-o://8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62" gracePeriod=30 Apr 28 20:08:47.281007 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:08:47.280973 2578 logging.go:55] [core] [Channel #675 SubChannel #676]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused" Apr 28 20:08:47.758626 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:47.758590 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe4d5257-883e-411a-9561-a398809e611d" containerID="d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380" exitCode=0 Apr 28 20:08:47.758840 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:47.758669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerDied","Data":"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380"} Apr 28 20:08:47.984428 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:47.984406 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:08:48.112572 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112542 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112588 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112668 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5cff\" (UniqueName: 
\"kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112706 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112732 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112773 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112762 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds\") pod \"fe4d5257-883e-411a-9561-a398809e611d\" (UID: \"fe4d5257-883e-411a-9561-a398809e611d\") " Apr 28 20:08:48.112991 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112865 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:48.112991 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.112972 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.113166 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.113131 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:48.113284 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.113266 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:48.113524 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.113504 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:48.114908 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.114884 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:08:48.115013 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.114982 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff" (OuterVolumeSpecName: "kube-api-access-t5cff") pod "fe4d5257-883e-411a-9561-a398809e611d" (UID: "fe4d5257-883e-411a-9561-a398809e611d"). InnerVolumeSpecName "kube-api-access-t5cff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:08:48.214290 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.214259 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-uds\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.214290 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.214287 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe4d5257-883e-411a-9561-a398809e611d-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.214444 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.214296 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5cff\" (UniqueName: \"kubernetes.io/projected/fe4d5257-883e-411a-9561-a398809e611d-kube-api-access-t5cff\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.214444 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.214307 2578 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.214444 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.214317 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe4d5257-883e-411a-9561-a398809e611d-tokenizer-tmp\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:08:48.281302 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.281269 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded" Apr 28 20:08:48.763202 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.763168 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe4d5257-883e-411a-9561-a398809e611d" containerID="8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62" exitCode=0 Apr 28 20:08:48.763358 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.763245 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" Apr 28 20:08:48.763358 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.763249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerDied","Data":"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62"} Apr 28 20:08:48.763358 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.763284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b" event={"ID":"fe4d5257-883e-411a-9561-a398809e611d","Type":"ContainerDied","Data":"327434eeb6107a4aa47e0897b319f9317514a318eb55d19a44fa04d43385767d"} Apr 28 20:08:48.763358 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.763298 2578 scope.go:117] "RemoveContainer" containerID="8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62" Apr 28 20:08:48.771552 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.771536 2578 scope.go:117] "RemoveContainer" containerID="d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380" Apr 28 20:08:48.778277 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.778259 2578 scope.go:117] "RemoveContainer" containerID="db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13" Apr 28 20:08:48.784025 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.784002 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:08:48.785765 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.785747 2578 scope.go:117] "RemoveContainer" containerID="8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62" Apr 28 20:08:48.786327 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:48.786284 2578 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62\": container with ID starting with 8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62 not found: ID does not exist" containerID="8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62" Apr 28 20:08:48.786418 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.786319 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62"} err="failed to get container status \"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62\": rpc error: code = NotFound desc = could not find container \"8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62\": container with ID starting with 8f8733de1f57af678e5dee8d42cf14456ef7fd7f14b3827e511cc6dcecbfdc62 not found: ID does not exist" Apr 28 20:08:48.786418 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.786348 2578 scope.go:117] "RemoveContainer" containerID="d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380" Apr 28 20:08:48.786703 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:48.786679 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380\": container with ID starting with d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380 not found: ID does not exist" containerID="d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380" Apr 28 20:08:48.786782 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.786713 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380"} err="failed to get container status \"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380\": rpc error: 
code = NotFound desc = could not find container \"d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380\": container with ID starting with d3c1c0b3a79c2fc5784c5ae2b542d8457b3215b343a646f637b06b6a80ef9380 not found: ID does not exist" Apr 28 20:08:48.786782 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.786735 2578 scope.go:117] "RemoveContainer" containerID="db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13" Apr 28 20:08:48.787044 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:08:48.787025 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13\": container with ID starting with db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13 not found: ID does not exist" containerID="db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13" Apr 28 20:08:48.787142 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.787050 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13"} err="failed to get container status \"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13\": rpc error: code = NotFound desc = could not find container \"db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13\": container with ID starting with db2f0a50a3dc028f8dcb10508c5fbd19342e85805d5a0ff4f07e51721d367b13 not found: ID does not exist" Apr 28 20:08:48.787595 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:48.787580 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-9m42b"] Apr 28 20:08:49.195425 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:49.195392 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4d5257-883e-411a-9561-a398809e611d" 
path="/var/lib/kubelet/pods/fe4d5257-883e-411a-9561-a398809e611d/volumes" Apr 28 20:08:54.310058 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310008 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6"] Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310281 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="main" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310292 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="main" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310311 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="storage-initializer" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310317 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="storage-initializer" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310324 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="tokenizer" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310330 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="tokenizer" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310375 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="tokenizer" Apr 28 20:08:54.310472 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.310382 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fe4d5257-883e-411a-9561-a398809e611d" containerName="main" Apr 28 20:08:54.314932 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.314915 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.318485 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.318463 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\"" Apr 28 20:08:54.319155 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.319136 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 28 20:08:54.319228 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.319139 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:08:54.319375 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.319363 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:08:54.319427 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.319410 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-kxgrg\"" Apr 28 20:08:54.326888 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.326868 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6"] Apr 28 20:08:54.362792 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: 
\"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.362885 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.362885 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.362963 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrhj6\" (UniqueName: \"kubernetes.io/projected/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kube-api-access-zrhj6\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.362963 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.363042 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.362968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.463777 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.463875 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrhj6\" (UniqueName: \"kubernetes.io/projected/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kube-api-access-zrhj6\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.463875 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.463875 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.463990 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.464032 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.463987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.464153 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.464137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.464206 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.464178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.464241 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.464205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.464279 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.464264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.466657 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.466621 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.472790 
ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.472770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrhj6\" (UniqueName: \"kubernetes.io/projected/c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd-kube-api-access-zrhj6\") pod \"stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6\" (UID: \"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.623942 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.623911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:54.746177 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.746143 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6"] Apr 28 20:08:54.748969 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:08:54.748940 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0741a9a_c65a_4b1c_a03c_f366d5f0fbcd.slice/crio-da023996fdb5a6ab2419185fd805746a351d9615e783efc34874fd7644c37e3c WatchSource:0}: Error finding container da023996fdb5a6ab2419185fd805746a351d9615e783efc34874fd7644c37e3c: Status 404 returned error can't find the container with id da023996fdb5a6ab2419185fd805746a351d9615e783efc34874fd7644c37e3c Apr 28 20:08:54.787523 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:54.787493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" event={"ID":"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd","Type":"ContainerStarted","Data":"da023996fdb5a6ab2419185fd805746a351d9615e783efc34874fd7644c37e3c"} Apr 28 20:08:55.791475 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:55.791360 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd" containerID="b0ec4349228934e39db9b41288df4f7d6a515baa247979bcf47215cb232d6b0f" exitCode=0 Apr 28 20:08:55.791475 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:55.791419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" event={"ID":"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd","Type":"ContainerDied","Data":"b0ec4349228934e39db9b41288df4f7d6a515baa247979bcf47215cb232d6b0f"} Apr 28 20:08:56.797382 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:56.797347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" event={"ID":"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd","Type":"ContainerStarted","Data":"c936ef46954d33188a3c8d0cded6745ce5236e7372f124f92ca3439a4ff4b2f7"} Apr 28 20:08:56.797382 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:56.797384 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" event={"ID":"c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd","Type":"ContainerStarted","Data":"c51be8154fcd5419fa81fe968a8edcb2308d37d07a92a61d052aba6f39d94144"} Apr 28 20:08:56.797822 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:56.797505 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:08:56.821692 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:08:56.821621 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" podStartSLOduration=2.821596218 podStartE2EDuration="2.821596218s" podCreationTimestamp="2026-04-28 20:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 
20:08:56.820445352 +0000 UTC m=+3138.170886760" watchObservedRunningTime="2026-04-28 20:08:56.821596218 +0000 UTC m=+3138.172037627" Apr 28 20:09:04.625056 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:09:04.625011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:09:04.625056 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:09:04.625063 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:09:04.627595 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:09:04.627565 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:09:04.825301 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:09:04.825274 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:09:25.828016 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:09:25.827987 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6" Apr 28 20:11:39.308347 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:11:39.308234 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:11:39.318207 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:11:39.318190 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:16:39.328173 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:16:39.328067 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:16:39.338487 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:16:39.338468 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:21:39.346683 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:21:39.346567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:21:39.358002 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:21:39.357983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:23:41.494510 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.494474 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht"] Apr 28 20:23:41.497487 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.497471 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.501116 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.501054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 28 20:23:41.501116 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.501073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hbt5m\"" Apr 28 20:23:41.505876 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.505852 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht"] Apr 28 20:23:41.591856 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.591824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e7244c-cb78-487d-af6a-8c33378b5d1c-cert\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.591983 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.591860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5snb\" (UniqueName: \"kubernetes.io/projected/60e7244c-cb78-487d-af6a-8c33378b5d1c-kube-api-access-v5snb\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.692835 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.692805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e7244c-cb78-487d-af6a-8c33378b5d1c-cert\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " 
pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.692984 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.692853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5snb\" (UniqueName: \"kubernetes.io/projected/60e7244c-cb78-487d-af6a-8c33378b5d1c-kube-api-access-v5snb\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.695263 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.695240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60e7244c-cb78-487d-af6a-8c33378b5d1c-cert\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.700415 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.700394 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5snb\" (UniqueName: \"kubernetes.io/projected/60e7244c-cb78-487d-af6a-8c33378b5d1c-kube-api-access-v5snb\") pod \"llmisvc-controller-manager-7c5d89bcd8-hxpht\" (UID: \"60e7244c-cb78-487d-af6a-8c33378b5d1c\") " pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.808735 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.808668 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:41.934316 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.934243 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht"] Apr 28 20:23:41.936539 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:23:41.936511 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod60e7244c_cb78_487d_af6a_8c33378b5d1c.slice/crio-10133cd13a36b714fb86704115050de7fa9ce885e46bfc14df3368e2c00d10ee WatchSource:0}: Error finding container 10133cd13a36b714fb86704115050de7fa9ce885e46bfc14df3368e2c00d10ee: Status 404 returned error can't find the container with id 10133cd13a36b714fb86704115050de7fa9ce885e46bfc14df3368e2c00d10ee Apr 28 20:23:41.937822 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:41.937803 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:23:42.487057 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:42.487021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" event={"ID":"60e7244c-cb78-487d-af6a-8c33378b5d1c","Type":"ContainerStarted","Data":"10133cd13a36b714fb86704115050de7fa9ce885e46bfc14df3368e2c00d10ee"} Apr 28 20:23:46.502074 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:46.502039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" event={"ID":"60e7244c-cb78-487d-af6a-8c33378b5d1c","Type":"ContainerStarted","Data":"d405dd77f7714a906b7603eb511bd622bdb7ed00699253ec89e47becd4c108d3"} Apr 28 20:23:46.502456 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:46.502095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:23:46.518798 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:23:46.518753 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" podStartSLOduration=1.4747938139999999 podStartE2EDuration="5.51873985s" podCreationTimestamp="2026-04-28 20:23:41 +0000 UTC" firstStartedPulling="2026-04-28 20:23:41.937969222 +0000 UTC m=+4023.288410606" lastFinishedPulling="2026-04-28 20:23:45.981915254 +0000 UTC m=+4027.332356642" observedRunningTime="2026-04-28 20:23:46.517485715 +0000 UTC m=+4027.867927122" watchObservedRunningTime="2026-04-28 20:23:46.51873985 +0000 UTC m=+4027.869181255" Apr 28 20:24:17.506844 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:17.506774 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7c5d89bcd8-hxpht" Apr 28 20:24:54.385845 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.385813 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"] Apr 28 20:24:54.389211 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.389194 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.391467 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.391446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 28 20:24:54.391862 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.391837 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-7tvfq\"" Apr 28 20:24:54.398330 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.398312 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"] Apr 28 20:24:54.434401 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.434520 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.434520 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434456 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.434520 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.434520 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.434681 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.434536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljvr\" (UniqueName: \"kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.534874 ip-10-0-143-22 kubenswrapper[2578]: I0428 
20:24:54.534841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535038 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.534904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535038 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.534926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535038 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.534944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535038 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.534960 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535038 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.534975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljvr\" (UniqueName: \"kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535336 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.535312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535396 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.535340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535433 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.535415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.535487 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.535468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.537702 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.537681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.542779 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.542758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljvr\" (UniqueName: \"kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.700655 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.700565 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:54.831890 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:54.831861 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"] Apr 28 20:24:54.834335 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:24:54.834304 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd22b0dd2_788c_4dda_871c_7be54f9f8178.slice/crio-c714c59112bfc0e3b6357d9278b4004615e2a6e58ada9a4e7dbe36e55cf744fc WatchSource:0}: Error finding container c714c59112bfc0e3b6357d9278b4004615e2a6e58ada9a4e7dbe36e55cf744fc: Status 404 returned error can't find the container with id c714c59112bfc0e3b6357d9278b4004615e2a6e58ada9a4e7dbe36e55cf744fc Apr 28 20:24:55.709474 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:55.709441 2578 generic.go:358] "Generic (PLEG): container finished" podID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerID="242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff" exitCode=0 Apr 28 20:24:55.709837 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:55.709504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerDied","Data":"242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff"} Apr 28 20:24:55.709837 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:55.709539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerStarted","Data":"c714c59112bfc0e3b6357d9278b4004615e2a6e58ada9a4e7dbe36e55cf744fc"} Apr 28 20:24:56.715058 ip-10-0-143-22 kubenswrapper[2578]: I0428 
20:24:56.715026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerStarted","Data":"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"} Apr 28 20:24:56.715058 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:56.715063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerStarted","Data":"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"} Apr 28 20:24:56.715489 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:56.715096 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:24:56.736072 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:24:56.736019 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" podStartSLOduration=2.73600367 podStartE2EDuration="2.73600367s" podCreationTimestamp="2026-04-28 20:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:24:56.734805932 +0000 UTC m=+4098.085247336" watchObservedRunningTime="2026-04-28 20:24:56.73600367 +0000 UTC m=+4098.086445077" Apr 28 20:25:04.700972 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:25:04.700933 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:25:04.700972 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:25:04.700983 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:25:04.703496 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:25:04.703469 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:25:04.740854 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:25:04.740825 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:25:25.745729 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:25:25.745623 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:26:39.366709 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:26:39.366682 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:26:39.378443 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:26:39.378423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:31:39.387075 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:31:39.386960 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:31:39.399093 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:31:39.399074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:36:39.406725 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:36:39.406595 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:36:39.426724 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:36:39.426698 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:37:32.458304 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:32.458271 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"] Apr 28 20:37:32.458832 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:32.458567 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="main" containerID="cri-o://ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791" gracePeriod=30 Apr 28 20:37:32.458832 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:32.458669 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="tokenizer" containerID="cri-o://f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82" gracePeriod=30 Apr 28 20:37:33.056194 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.056154 2578 generic.go:358] "Generic (PLEG): container finished" podID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerID="ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791" exitCode=0 Apr 28 20:37:33.056361 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.056233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" 
event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerDied","Data":"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"} Apr 28 20:37:33.608510 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.608489 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" Apr 28 20:37:33.675484 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675484 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675492 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675695 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675519 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675695 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675550 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljvr\" (UniqueName: \"kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675695 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675587 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675695 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675657 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache\") pod \"d22b0dd2-788c-4dda-871c-7be54f9f8178\" (UID: \"d22b0dd2-788c-4dda-871c-7be54f9f8178\") " Apr 28 20:37:33.675906 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675726 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:37:33.675906 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675896 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-uds\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.676014 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675929 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.676014 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.675999 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.676312 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.676291 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.677823 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.677782 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr" (OuterVolumeSpecName: "kube-api-access-7ljvr") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "kube-api-access-7ljvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:37:33.677936 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.677915 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d22b0dd2-788c-4dda-871c-7be54f9f8178" (UID: "d22b0dd2-788c-4dda-871c-7be54f9f8178"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:37:33.776777 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.776754 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-tmp\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 20:37:33.776777 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.776775 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-tokenizer-cache\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 20:37:33.776910 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.776784 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d22b0dd2-788c-4dda-871c-7be54f9f8178-kserve-provision-location\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 20:37:33.776910 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.776794 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d22b0dd2-788c-4dda-871c-7be54f9f8178-tls-certs\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 20:37:33.776910 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:33.776803 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ljvr\" (UniqueName: \"kubernetes.io/projected/d22b0dd2-788c-4dda-871c-7be54f9f8178-kube-api-access-7ljvr\") on node \"ip-10-0-143-22.ec2.internal\" DevicePath \"\""
Apr 28 20:37:34.062368 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.062271 2578 generic.go:358] "Generic (PLEG): container finished" podID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerID="f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82" exitCode=0
Apr 28 20:37:34.062537 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.062357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerDied","Data":"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"}
Apr 28 20:37:34.062537 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.062381 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"
Apr 28 20:37:34.062537 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.062409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb" event={"ID":"d22b0dd2-788c-4dda-871c-7be54f9f8178","Type":"ContainerDied","Data":"c714c59112bfc0e3b6357d9278b4004615e2a6e58ada9a4e7dbe36e55cf744fc"}
Apr 28 20:37:34.062537 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.062430 2578 scope.go:117] "RemoveContainer" containerID="f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"
Apr 28 20:37:34.070876 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.070858 2578 scope.go:117] "RemoveContainer" containerID="ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"
Apr 28 20:37:34.078087 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.078069 2578 scope.go:117] "RemoveContainer" containerID="242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff"
Apr 28 20:37:34.084273 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.084251 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"]
Apr 28 20:37:34.085389 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.085370 2578 scope.go:117] "RemoveContainer" containerID="f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"
Apr 28 20:37:34.085645 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:37:34.085611 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82\": container with ID starting with f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82 not found: ID does not exist" containerID="f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"
Apr 28 20:37:34.085700 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.085659 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82"} err="failed to get container status \"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82\": rpc error: code = NotFound desc = could not find container \"f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82\": container with ID starting with f943e4be32e18c0b67eb639b122f6bf5022605f89e8593c066bb16efaed51d82 not found: ID does not exist"
Apr 28 20:37:34.085700 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.085685 2578 scope.go:117] "RemoveContainer" containerID="ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"
Apr 28 20:37:34.086018 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:37:34.085980 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791\": container with ID starting with ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791 not found: ID does not exist" containerID="ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"
Apr 28 20:37:34.086149 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.086017 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791"} err="failed to get container status \"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791\": rpc error: code = NotFound desc = could not find container \"ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791\": container with ID starting with ff1eeb06ebc149f75806fe9e7bfe39502975f932a724527fca849f76956db791 not found: ID does not exist"
Apr 28 20:37:34.086149 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.086041 2578 scope.go:117] "RemoveContainer" containerID="242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff"
Apr 28 20:37:34.086557 ip-10-0-143-22 kubenswrapper[2578]: E0428 20:37:34.086534 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff\": container with ID starting with 242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff not found: ID does not exist" containerID="242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff"
Apr 28 20:37:34.086684 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.086562 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff"} err="failed to get container status \"242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff\": rpc error: code = NotFound desc = could not find container \"242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff\": container with ID starting with 242a612cee9d043508f3bd91611a0d8a6ba0f98f8082d3d34b89825f054b67ff not found: ID does not exist"
Apr 28 20:37:34.087792 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:34.087774 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schek5blb"]
Apr 28 20:37:35.195711 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:35.195680 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" path="/var/lib/kubelet/pods/d22b0dd2-788c-4dda-871c-7be54f9f8178/volumes"
Apr 28 20:37:48.417228 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:48.417191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:48.445104 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:48.445077 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:48.450291 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:48.450273 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:49.429475 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:49.429443 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:49.445557 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:49.445527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:49.451516 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:49.451489 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:50.421148 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:50.421118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:50.449999 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:50.449972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:50.457147 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:50.457121 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:51.408517 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:51.408484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:51.425068 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:51.425037 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:51.430107 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:51.430087 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:52.381757 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:52.381725 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:52.396260 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:52.396238 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:52.402335 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:52.402315 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:53.380001 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:53.379965 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:53.393582 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:53.393553 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:53.399066 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:53.399040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:54.355794 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:54.355760 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:54.370400 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:54.370375 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:54.375699 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:54.375666 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:55.344931 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:55.344897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:55.360173 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:55.360146 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:55.364993 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:55.364954 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:56.321597 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:56.321565 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:56.336619 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:56.336590 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:56.341440 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:56.341421 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:57.317360 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:57.317332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:57.333368 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:57.333343 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:57.339540 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:57.339519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:58.291190 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:58.291158 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:58.305377 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:58.305356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:58.310267 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:58.310249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:37:59.306099 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:59.306065 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:37:59.320618 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:59.320583 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:37:59.326158 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:37:59.326138 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:38:00.372412 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:00.372381 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:38:00.387830 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:00.387800 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:38:00.395161 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:00.395132 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:38:01.403200 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:01.403171 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/main/0.log"
Apr 28 20:38:01.417439 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:01.417413 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/tokenizer/0.log"
Apr 28 20:38:01.424953 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:01.424932 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-7bd9d964b9-wtpb6_c0741a9a-c65a-4b1c-a03c-f366d5f0fbcd/storage-initializer/0.log"
Apr 28 20:38:04.766897 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:04.766867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:04.804854 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:04.804824 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:05.627781 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:05.627754 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:05.672650 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:05.672601 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:06.500843 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:06.500812 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:06.542564 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:06.542541 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:07.390474 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:07.390446 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:07.435506 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:07.435481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:08.253407 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:08.253380 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:08.295319 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:08.295288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:13.276524 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:13.276493 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7mtgn_3eaec6bb-3277-478e-9ecc-a557fa5a5b7f/global-pull-secret-syncer/0.log"
Apr 28 20:38:13.443158 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:13.443123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p5qkw_88c3f56d-6859-4f8e-a645-45fb36262479/konnectivity-agent/0.log"
Apr 28 20:38:13.590601 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:13.590483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-22.ec2.internal_530700f112f77c890973fb51d737f28d/haproxy/0.log"
Apr 28 20:38:17.353854 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:17.353825 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-jttlm_8c8b757a-a931-4552-98ef-8d189e70abc4/kuadrant-console-plugin/0.log"
Apr 28 20:38:17.423702 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:17.423669 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kjjnk_61657b7e-2014-41bd-9f17-c3184bd201f7/limitador/0.log"
Apr 28 20:38:18.896107 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:18.896073 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pgxqf_d8445255-e1df-44af-92d4-781af5f9f6b1/node-exporter/0.log"
Apr 28 20:38:18.916207 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:18.916179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pgxqf_d8445255-e1df-44af-92d4-781af5f9f6b1/kube-rbac-proxy/0.log"
Apr 28 20:38:18.932991 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:18.932967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pgxqf_d8445255-e1df-44af-92d4-781af5f9f6b1/init-textfile/0.log"
Apr 28 20:38:20.770066 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:20.770030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-j9zgn_92170fe7-3ead-4d08-90fa-aa8b6a8f3a4f/networking-console-plugin/0.log"
Apr 28 20:38:22.113594 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113558 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"]
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113870 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="main"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113882 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="main"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113912 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="tokenizer"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113918 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="tokenizer"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113927 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="storage-initializer"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113933 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="storage-initializer"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113981 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="main"
Apr 28 20:38:22.113987 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.113989 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d22b0dd2-788c-4dda-871c-7be54f9f8178" containerName="tokenizer"
Apr 28 20:38:22.116902 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.116879 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.119464 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.119441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjc6s\"/\"openshift-service-ca.crt\""
Apr 28 20:38:22.120653 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.120619 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjc6s\"/\"kube-root-ca.crt\""
Apr 28 20:38:22.120653 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.120645 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hjc6s\"/\"default-dockercfg-nkncc\""
Apr 28 20:38:22.124542 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.124522 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"]
Apr 28 20:38:22.248085 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.248060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-sys\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.248240 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.248104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-proc\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.248240 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.248121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-lib-modules\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.248240 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.248167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-podres\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.248240 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.248229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwlj\" (UniqueName: \"kubernetes.io/projected/39020868-f69a-43fc-b1a3-6368769565c3-kube-api-access-zhwlj\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.348893 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.348858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwlj\" (UniqueName: \"kubernetes.io/projected/39020868-f69a-43fc-b1a3-6368769565c3-kube-api-access-zhwlj\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.348906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-sys\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.348935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-proc\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.348951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-lib-modules\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.348968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-podres\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.349062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-sys\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.349062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-proc\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.349089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-podres\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.349341 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.349130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39020868-f69a-43fc-b1a3-6368769565c3-lib-modules\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.356770 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.356751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwlj\" (UniqueName: \"kubernetes.io/projected/39020868-f69a-43fc-b1a3-6368769565c3-kube-api-access-zhwlj\") pod \"perf-node-gather-daemonset-ntn57\" (UID: \"39020868-f69a-43fc-b1a3-6368769565c3\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.428126 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.428065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:22.547342 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.547309 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"]
Apr 28 20:38:22.550453 ip-10-0-143-22 kubenswrapper[2578]: W0428 20:38:22.550424 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod39020868_f69a_43fc_b1a3_6368769565c3.slice/crio-8e938b2725880c408ae2ee831395759ed3137655e0350b39498721b1b6e98126 WatchSource:0}: Error finding container 8e938b2725880c408ae2ee831395759ed3137655e0350b39498721b1b6e98126: Status 404 returned error can't find the container with id 8e938b2725880c408ae2ee831395759ed3137655e0350b39498721b1b6e98126
Apr 28 20:38:22.552138 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.552117 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 20:38:22.989456 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:22.989427 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c55mw_8327b8b7-48d4-4d18-bec4-8cea6c826302/dns/0.log"
Apr 28 20:38:23.004111 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.004082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c55mw_8327b8b7-48d4-4d18-bec4-8cea6c826302/kube-rbac-proxy/0.log"
Apr 28 20:38:23.053141 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.053116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bgmp8_d5c1a9d5-7a1d-4369-837a-3ed96d5f107f/dns-node-resolver/0.log"
Apr 28 20:38:23.219213 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.219172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57" event={"ID":"39020868-f69a-43fc-b1a3-6368769565c3","Type":"ContainerStarted","Data":"30be8b52c968939f558c9e9758d5df935aaca1f94bbb78ee432f304d5dbd70f7"}
Apr 28 20:38:23.219213 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.219212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57" event={"ID":"39020868-f69a-43fc-b1a3-6368769565c3","Type":"ContainerStarted","Data":"8e938b2725880c408ae2ee831395759ed3137655e0350b39498721b1b6e98126"}
Apr 28 20:38:23.219781 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.219298 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57"
Apr 28 20:38:23.235046 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.235005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57" podStartSLOduration=1.234995571 podStartE2EDuration="1.234995571s" podCreationTimestamp="2026-04-28 20:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:38:23.23417489 +0000 UTC m=+4904.584616308" watchObservedRunningTime="2026-04-28 20:38:23.234995571 +0000 UTC m=+4904.585436976"
Apr 28 20:38:23.551378 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:23.551347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hp74n_692b128d-82a4-4c26-b17d-0b4d804ef295/node-ca/0.log"
Apr 28 20:38:24.871846 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:24.871813 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wlbdc_b5ab40ee-0c46-43db-8a80-02e47728a72f/serve-healthcheck-canary/0.log"
Apr 28 20:38:25.389106 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:25.389081 2578 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-insights_insights-runtime-extractor-nghpr_9de6d54e-8eab-4890-b6e4-99648fd535fc/kube-rbac-proxy/0.log" Apr 28 20:38:25.406195 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:25.406168 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nghpr_9de6d54e-8eab-4890-b6e4-99648fd535fc/exporter/0.log" Apr 28 20:38:25.422259 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:25.422228 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nghpr_9de6d54e-8eab-4890-b6e4-99648fd535fc/extractor/0.log" Apr 28 20:38:27.924392 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:27.924362 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-79cf4cb497-p49cc_fa308b9b-52b6-41bb-8041-bfb109b7754e/manager/0.log" Apr 28 20:38:28.474871 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:28.474838 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-7c5d89bcd8-hxpht_60e7244c-cb78-487d-af6a-8c33378b5d1c/manager/0.log" Apr 28 20:38:28.491091 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:28.491071 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tptx9_397c0b79-9faa-41e6-ba3e-eec0cc748074/server/0.log" Apr 28 20:38:28.732015 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:28.731936 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-qj2sh_a72d274a-7708-4123-b7e0-0de0d2ce2e1a/s3-init/0.log" Apr 28 20:38:29.232537 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:29.232503 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-ntn57" Apr 28 20:38:34.664960 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.664905 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/kube-multus-additional-cni-plugins/0.log" Apr 28 20:38:34.684691 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.684663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/egress-router-binary-copy/0.log" Apr 28 20:38:34.701382 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.701354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/cni-plugins/0.log" Apr 28 20:38:34.717567 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.717542 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/bond-cni-plugin/0.log" Apr 28 20:38:34.733431 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.733414 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/routeoverride-cni/0.log" Apr 28 20:38:34.750215 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.750193 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/whereabouts-cni-bincopy/0.log" Apr 28 20:38:34.766585 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.766565 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v4wsc_d18eaae1-d122-4fa3-8b2e-ffc7868bfd03/whereabouts-cni/0.log" Apr 28 20:38:34.831344 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.831318 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4ddb_6544e7a1-69d4-41e0-b18d-961cdaa5418d/kube-multus/0.log" Apr 28 20:38:34.851457 ip-10-0-143-22 
kubenswrapper[2578]: I0428 20:38:34.851432 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ssxm_96593340-195c-4a9b-8d15-babb74ebf1c6/network-metrics-daemon/0.log" Apr 28 20:38:34.866671 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:34.866651 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ssxm_96593340-195c-4a9b-8d15-babb74ebf1c6/kube-rbac-proxy/0.log" Apr 28 20:38:36.301147 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.301110 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-controller/0.log" Apr 28 20:38:36.314946 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.314912 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/0.log" Apr 28 20:38:36.359444 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.359420 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovn-acl-logging/1.log" Apr 28 20:38:36.378938 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.378913 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/kube-rbac-proxy-node/0.log" Apr 28 20:38:36.399514 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.399463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/kube-rbac-proxy-ovn-metrics/0.log" Apr 28 20:38:36.412784 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.412759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/northd/0.log" Apr 28 20:38:36.428558 
ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.428532 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/nbdb/0.log" Apr 28 20:38:36.443458 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.443429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/sbdb/0.log" Apr 28 20:38:36.618031 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:36.617993 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssm92_1fe57666-24f8-4a83-ae5a-59f5b12c7a9e/ovnkube-controller/0.log" Apr 28 20:38:37.639185 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:37.639153 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-k9zr5_27090a69-2cdb-4eae-a82d-5fa7351f8654/network-check-target-container/0.log" Apr 28 20:38:38.610875 ip-10-0-143-22 kubenswrapper[2578]: I0428 20:38:38.610841 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jgrh8_2352f752-8d71-483d-9d43-b79ba63f8cad/iptables-alerter/0.log"