Apr 16 18:09:40.688861 ip-10-0-141-189 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:09:41.074610 ip-10-0-141-189 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:41.074610 ip-10-0-141-189 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:09:41.074610 ip-10-0-141-189 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:41.074610 ip-10-0-141-189 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:09:41.074610 ip-10-0-141-189 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:41.077880 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.077803 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:09:41.082959 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082930 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:41.082959 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082950 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:41.082959 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082956 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:41.082959 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082961 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:41.082959 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082965 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082969 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082973 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082977 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082980 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082984 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082988 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082992 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082995 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.082998 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083002 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083006 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083011 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083015 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083019 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083023 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083027 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083030 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083035 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083039 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:41.083248 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083042 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083046 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083050 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083062 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083068 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083073 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083078 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083082 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083087 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083092 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083096 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083100 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083104 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083109 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083114 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083118 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083122 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083126 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083131 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083135 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:41.084090 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083139 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083144 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083148 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083152 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083157 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083161 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083165 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083169 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083175 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083180 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083184 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083188 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083192 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083197 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083202 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083206 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083210 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083214 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083218 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:41.085022 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083222 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083226 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083230 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083234 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083239 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083243 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083248 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083254 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083259 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083263 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083268 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083275 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083280 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083285 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083289 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083302 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083307 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083310 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083314 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083319 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:41.085891 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083323 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083327 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.083331 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084026 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084035 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084040 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084044 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084048 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084053 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084057 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084062 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084066 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084070 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084075 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084079 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084083 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084087 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084091 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084095 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084100 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:41.086471 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084104 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084107 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084112 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084116 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084120 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084125 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084129 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084141 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084146 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084150 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084154 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084158 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084162 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084167 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084171 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084175 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084179 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084183 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084187 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:41.087059 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084192 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084197 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084201 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084204 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084208 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084212 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084216 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084221 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084225 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084230 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084234 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084238 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084244 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084248 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084252 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084256 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084260 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084265 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084269 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084273 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:41.087748 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084277 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084295 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084302 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084307 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084311 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084316 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084320 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084324 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084328 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084332 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084337 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084341 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084345 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084350 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084354 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084367 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084371 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084375 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084380 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:41.088345 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084384 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084389 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084393 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084399 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084403 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084407 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084412 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084417 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084423 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084428 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.084432 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084542 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084552 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084584 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084591 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084606 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084612 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084619 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084625 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084630 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:09:41.089000 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084635 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084640 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084646 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084651 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084655 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084660 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084665 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084669 2567 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084699 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084706 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084718 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084723 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084728 2567 flags.go:64] FLAG: --config-dir=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084733 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084739 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084745 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084750 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084755 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084760 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084765 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084770 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084774 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084779 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084784 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084791 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:09:41.089491 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084796 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084801 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084805 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084818 2567 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084824 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084834 2567 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084839 2567 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084844 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084848 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084854 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084860 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084865 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084870 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084874 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084879 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084884 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084890 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084894 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084900 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084904 2567 flags.go:64] FLAG:
--fail-swap-on="true" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084909 2567 flags.go:64] FLAG: --feature-gates="" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084916 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084921 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084926 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084931 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084936 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:09:41.090119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084940 2567 flags.go:64] FLAG: --help="false" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084946 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084951 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084956 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084960 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084966 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084971 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:09:41.084976 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084980 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084985 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.084997 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085003 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085008 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085013 2567 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085018 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085025 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085030 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085034 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085039 2567 flags.go:64] FLAG: --lock-file="" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085043 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085048 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085052 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:09:41.090766 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085065 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:09:41.090766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085070 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085074 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085079 2567 flags.go:64] FLAG: --logging-format="text" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085084 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085090 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085094 2567 flags.go:64] FLAG: --manifest-url="" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085098 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085106 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085111 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085117 2567 flags.go:64] FLAG: --max-pods="110" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085122 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085127 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085131 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085136 2567 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085141 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085145 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085150 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085163 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085167 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085172 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085185 2567 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085191 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085199 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085204 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:09:41.091333 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085213 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085218 2567 flags.go:64] FLAG: --port="10250" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085223 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: 
I0416 18:09:41.085228 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-026c8addc02f7fd15" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085233 2567 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085239 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085243 2567 flags.go:64] FLAG: --register-node="true" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085248 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085254 2567 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085260 2567 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085265 2567 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085269 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085274 2567 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085280 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085285 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085290 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085294 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085299 2567 flags.go:64] FLAG: --runonce="false" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: 
I0416 18:09:41.085304 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085309 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085314 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085319 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085323 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085328 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085333 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085338 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:09:41.091980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085342 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085347 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085351 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085367 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085372 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085377 2567 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085384 2567 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085393 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085397 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085402 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085412 2567 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085417 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085421 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085426 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085432 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085437 2567 flags.go:64] FLAG: --v="2" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085444 2567 flags.go:64] FLAG: --version="false" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085450 2567 flags.go:64] FLAG: --vmodule="" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085456 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.085462 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085659 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:41.092637 ip-10-0-141-189 
kubenswrapper[2567]: W0416 18:09:41.085667 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085672 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085677 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:41.092637 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085681 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085686 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085690 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085696 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085701 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085705 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085709 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085713 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085718 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085722 2567 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085726 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085730 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085745 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085750 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085756 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085761 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085766 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085770 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085774 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085778 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:41.093218 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085783 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085787 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085793 2567 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085797 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085801 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085806 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085810 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085815 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085819 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085823 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085827 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085831 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085836 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085840 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085846 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085853 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085857 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085862 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085866 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085870 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:41.093784 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085875 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085879 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085883 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085887 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085891 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085902 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085907 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085912 2567 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085916 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085920 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085924 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085930 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085936 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085941 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085948 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085953 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085957 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085962 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085966 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:41.094276 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085970 2567 feature_gate.go:328] 
unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085975 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085979 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085983 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085987 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085991 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085995 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.085999 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086003 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086007 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086012 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086016 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086020 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 
18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086024 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086028 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086033 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086037 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086041 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086045 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:41.094770 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086058 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086064 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086068 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.086072 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.086692 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.092813 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.092827 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092870 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092875 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092878 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092881 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092885 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092889 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092892 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092895 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:41.095243 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092898 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092901 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092904 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092906 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092909 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092912 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092914 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092917 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092921 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092924 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092927 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092929 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092939 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092942 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092945 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092947 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092950 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092953 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092955 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092959 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:41.095645 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092962 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092964 2567 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092967 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092976 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092979 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092982 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092984 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092987 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092989 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092992 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092995 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.092997 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093000 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093002 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: 
W0416 18:09:41.093005 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093007 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093010 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093012 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093015 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093017 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:41.096147 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093020 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093023 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093025 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093028 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093030 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093033 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093035 2567 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093038 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093040 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093043 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093045 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093048 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093050 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093053 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093055 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093059 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093065 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093068 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093070 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093073 
2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:41.096695 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093075 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093078 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093080 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093083 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093086 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093088 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093091 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093093 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093096 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093098 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093101 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093104 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 
18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093106 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093109 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093111 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093114 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093117 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:41.097194 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093119 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.093124 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093226 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093232 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093235 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093238 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093241 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093244 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093247 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093249 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093252 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093255 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093258 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093261 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093263 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:41.097658 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093266 2567 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093268 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093271 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093273 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093276 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093279 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093281 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093284 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093286 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093289 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093291 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093293 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093296 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:41.098034 
ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093298 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093301 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093303 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093306 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093308 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093311 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093314 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:41.098034 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093316 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093320 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093323 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093326 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093328 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093331 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093333 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093336 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093339 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093341 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093344 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093347 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093350 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093352 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093355 2567 feature_gate.go:328] unrecognized feature 
gate: ExternalSnapshotMetadata Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093357 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093360 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093362 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093365 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093367 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:41.098529 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093370 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093372 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093375 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093377 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093380 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093382 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093385 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093388 2567 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093390 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093393 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093395 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093398 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093400 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093402 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093405 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093407 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093410 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093413 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093415 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093418 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 
18:09:41.099043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093421 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093424 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093427 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093430 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093432 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093435 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093437 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093440 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093442 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093445 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093447 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093450 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:41.093452 2567 
feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.093457 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.094110 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:09:41.099528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.097535 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:09:41.099913 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.098403 2567 server.go:1019] "Starting client certificate rotation" Apr 16 18:09:41.099913 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.098523 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:41.099913 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.098573 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:41.120639 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.120619 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:41.122997 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.122978 2567 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:41.139973 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.139955 2567 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:09:41.144761 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.144747 2567 log.go:25] "Validated CRI v1 image API" Apr 16 18:09:41.146569 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.146539 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:09:41.149951 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.149931 2567 fs.go:135] Filesystem UUIDs: map[3adae11a-5e4a-45fe-bfea-4bd71feb8a6e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 eb4a59a1-e6d6-4638-a252-7e6c1f1cadaa:/dev/nvme0n1p4] Apr 16 18:09:41.149998 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.149952 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:09:41.155351 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155244 2567 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:41.153491337 +0000 UTC m=+0.357843155 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094935 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dd3c37ea02a91a841a635f0fa25d5 SystemUUID:ec2dd3c3-7ea0-2a91-a841-a635f0fa25d5 BootID:939de2d6-e110-49fb-ae03-f8e5087c7e34 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs 
Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a3:30:2c:48:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a3:30:2c:48:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:19:6d:c7:22:ae Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:09:41.155351 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155338 2567 manager_no_libpfm.go:29] 
cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:09:41.155506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155438 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:09:41.155506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155451 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:09:41.155748 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155730 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:09:41.155886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155749 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-141-189.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:09:41.155931 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155895 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:09:41.155931 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155903 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:09:41.155931 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.155915 
2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:41.156573 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.156547 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:41.157471 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.157462 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:41.157595 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.157586 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:09:41.159729 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.159720 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:09:41.159763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.159737 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:09:41.159763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.159748 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:09:41.159763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.159757 2567 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:09:41.159862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.159766 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:09:41.160749 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.160737 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:41.160787 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.160756 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:41.163331 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.163316 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:09:41.165084 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:09:41.165071 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:09:41.166367 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166356 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166375 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166381 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166387 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166393 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166398 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166404 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166410 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166416 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:09:41.166418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166422 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:09:41.166682 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166438 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
18:09:41.166682 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.166447 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:09:41.167179 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.167169 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:09:41.167179 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.167179 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:09:41.170544 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.170498 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:09:41.170611 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.170582 2567 server.go:1295] "Started kubelet" Apr 16 18:09:41.170777 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.170729 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:09:41.170852 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.170799 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:09:41.170968 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.170945 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:09:41.171350 ip-10-0-141-189 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:09:41.171990 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.171974 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:09:41.175281 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.175252 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-189.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:09:41.175377 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.175364 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-189.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:09:41.175612 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.175536 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:09:41.176882 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.176864 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:09:41.179623 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.179600 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:41.179955 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.179932 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:09:41.180091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180077 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:09:41.180494 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.179525 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-189.ec2.internal.18a6e8b844d2f796 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-189.ec2.internal,UID:ip-10-0-141-189.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-189.ec2.internal,},FirstTimestamp:2026-04-16 18:09:41.170542486 +0000 UTC m=+0.374894306,LastTimestamp:2026-04-16 18:09:41.170542486 +0000 UTC m=+0.374894306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-189.ec2.internal,}" Apr 16 18:09:41.180706 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180691 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:09:41.180764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180699 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:09:41.180764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180720 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:09:41.180846 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180808 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:09:41.180846 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180819 2567 reconciler.go:26] "Reconciler: start to sync 
state" Apr 16 18:09:41.180945 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.180874 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.180945 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180919 2567 factory.go:153] Registering CRI-O factory Apr 16 18:09:41.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.180960 2567 factory.go:223] Registration of the crio container factory successfully Apr 16 18:09:41.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181002 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:09:41.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181009 2567 factory.go:55] Registering systemd factory Apr 16 18:09:41.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181014 2567 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:09:41.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181036 2567 factory.go:103] Registering Raw factory Apr 16 18:09:41.181248 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181049 2567 manager.go:1196] Started watching for new ooms in manager Apr 16 18:09:41.181470 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.181454 2567 manager.go:319] Starting recovery of all containers Apr 16 18:09:41.189155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.189136 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bl8x6" Apr 16 18:09:41.189478 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.189446 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" 
cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:09:41.189686 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.189663 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-189.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:09:41.191375 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.191360 2567 manager.go:324] Recovery completed Apr 16 18:09:41.195518 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.195502 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.198102 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198089 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.198148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198115 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.198148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198127 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.198515 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198499 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bl8x6" Apr 16 18:09:41.198689 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198675 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:09:41.198729 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198691 2567 cpu_manager.go:223] "Reconciling" 
reconcilePeriod="10s" Apr 16 18:09:41.198729 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.198708 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:41.202615 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.202599 2567 policy_none.go:49] "None policy: Start" Apr 16 18:09:41.202615 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.202615 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:09:41.202810 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.202625 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:09:41.236148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236133 2567 manager.go:341] "Starting Device Plugin manager" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.236165 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236175 2567 server.go:85] "Starting device plugin registration server" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236398 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236407 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236491 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236599 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.236609 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.237157 
2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:09:41.258958 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.237189 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.304694 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.304663 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:09:41.305744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.305724 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:09:41.305801 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.305755 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:09:41.305801 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.305780 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:09:41.305801 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.305790 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:09:41.305931 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.305863 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:09:41.309823 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.309808 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:41.336925 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.336889 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.337607 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.337593 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.337657 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.337619 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.337657 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.337629 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.337657 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.337649 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.346737 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.346720 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.346778 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.346744 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-189.ec2.internal\": node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 
18:09:41.366542 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.366525 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.406467 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.406447 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal"] Apr 16 18:09:41.406533 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.406500 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.412991 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.412976 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.413085 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.413001 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.413085 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.413010 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.415209 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415196 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.415354 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415339 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.415414 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415373 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.415928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415898 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.415928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415902 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.415928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415926 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.416069 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415939 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.416069 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415942 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.416069 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.415956 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.418645 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.418630 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.418720 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.418653 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:41.419240 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.419226 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:41.419296 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.419249 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:41.419296 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.419261 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:41.440158 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.440136 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-189.ec2.internal\" not found" node="ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.444462 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.444448 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-189.ec2.internal\" not found" node="ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.467220 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.467204 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.483218 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.483198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c187cd16f95e6d697a69aca59443285-config\") pod 
\"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"2c187cd16f95e6d697a69aca59443285\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.483277 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.483223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.483277 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.483241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.567478 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.567451 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.583899 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.583945 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583906 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.583945 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c187cd16f95e6d697a69aca59443285-config\") pod \"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"2c187cd16f95e6d697a69aca59443285\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.584007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c187cd16f95e6d697a69aca59443285-config\") pod \"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"2c187cd16f95e6d697a69aca59443285\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.584007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583969 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.584007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.583976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e688c1981efc927195def8244abca8f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"8e688c1981efc927195def8244abca8f\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.668281 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.668242 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.743798 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.743773 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.747392 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:41.747375 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:41.768466 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.768438 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.868987 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.868968 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:41.969469 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:41.969421 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.025782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.025761 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:42.070050 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:42.070031 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.099481 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.099467 2567 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 16 18:09:42.100006 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.099590 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:09:42.100006 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.099592 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:09:42.171026 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:42.170905 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.180439 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.180423 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:42.194239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.194217 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:09:42.200744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.200718 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:41 +0000 UTC" deadline="2028-01-24 15:19:47.167097698 +0000 UTC" Apr 16 18:09:42.200744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.200744 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15549h10m4.96635687s" Apr 16 18:09:42.208839 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.208822 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:42.222364 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.222296 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kf5zd" Apr 16 18:09:42.228741 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.228724 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kf5zd" Apr 16 18:09:42.272078 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:42.272061 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.289631 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:42.289609 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e688c1981efc927195def8244abca8f.slice/crio-1373142f18c209dec7f2f97c07c82168cfa058ce117a47fbaf8036cb3d256455 WatchSource:0}: Error finding container 1373142f18c209dec7f2f97c07c82168cfa058ce117a47fbaf8036cb3d256455: Status 404 returned error can't find the container with id 1373142f18c209dec7f2f97c07c82168cfa058ce117a47fbaf8036cb3d256455 Apr 16 18:09:42.299749 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.299733 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:09:42.307390 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:42.307367 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c187cd16f95e6d697a69aca59443285.slice/crio-a7dfcdfe6c0fbb5f57e4cd1f6ff1166c9ea4125ea95d90643222ccca385957a5 WatchSource:0}: Error finding container a7dfcdfe6c0fbb5f57e4cd1f6ff1166c9ea4125ea95d90643222ccca385957a5: Status 404 returned error can't 
find the container with id a7dfcdfe6c0fbb5f57e4cd1f6ff1166c9ea4125ea95d90643222ccca385957a5 Apr 16 18:09:42.308364 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.308324 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"8e688c1981efc927195def8244abca8f","Type":"ContainerStarted","Data":"1373142f18c209dec7f2f97c07c82168cfa058ce117a47fbaf8036cb3d256455"} Apr 16 18:09:42.372640 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:42.372608 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.473125 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:42.473083 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 16 18:09:42.498446 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.498426 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:42.580871 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.580855 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 16 18:09:42.592342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.592327 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:09:42.593325 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.593313 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 16 18:09:42.602380 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:42.602361 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a 
DNS label is recommended: [must not contain dots]" Apr 16 18:09:43.161058 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.160985 2567 apiserver.go:52] "Watching apiserver" Apr 16 18:09:43.171111 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.171083 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:09:43.171512 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.171439 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-mq7lv","openshift-network-operator/iptables-alerter-ttkb5","openshift-ovn-kubernetes/ovnkube-node-htm65","kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd","openshift-cluster-node-tuning-operator/tuned-nd572","openshift-dns/node-resolver-lc2jd","openshift-image-registry/node-ca-87pfd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal","openshift-multus/multus-6cp9f","openshift-multus/multus-additional-cni-plugins-j5mwl","kube-system/konnectivity-agent-ptp6j","openshift-multus/network-metrics-daemon-bbpzd"] Apr 16 18:09:43.176853 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.176831 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.179301 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.179275 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.181473 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.181454 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.181630 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.181617 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.184409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.184409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184299 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:09:43.184570 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184482 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.184627 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184616 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.185100 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184926 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:09:43.185100 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.184980 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-srkb8\"" Apr 16 18:09:43.186591 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.186573 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lc2jd" Apr 16 18:09:43.188866 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.188842 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-87pfd" Apr 16 18:09:43.190882 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.190862 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-log-socket\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.190970 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.190897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.190970 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.190925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-system-cni-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.190970 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.190948 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/679d211e-bc5a-4f8c-a0ae-1aae973fb746-iptables-alerter-script\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.190970 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.190966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-systemd-units\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191005 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-env-overrides\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-cni-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-cni-binary-copy\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191099 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-etc-kubernetes\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-etc-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191145 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-script-lib\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191212 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh2kf\" (UniqueName: \"kubernetes.io/projected/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-kube-api-access-wh2kf\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-etc-selinux\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191272 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-k8s-cni-cncf-io\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-kubelet\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191338 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-multus-certs\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-node-log\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-hostroot\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-daemon-config\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191474 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-socket-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191491 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/679d211e-bc5a-4f8c-a0ae-1aae973fb746-host-slash\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191504 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-kubelet\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-systemd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-os-release\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-netns\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.191771 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrgw\" (UniqueName: \"kubernetes.io/projected/679d211e-bc5a-4f8c-a0ae-1aae973fb746-kube-api-access-fsrgw\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-netns\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191675 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-config\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovn-node-metrics-cert\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-registration-dir\") 
pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwl8\" (UniqueName: \"kubernetes.io/projected/e1b50604-060e-416e-b904-75ab9b75d209-kube-api-access-lmwl8\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.191771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191775 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-cnibin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-bin\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-netd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191860 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-sys-fs\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-bin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-conf-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qw95\" (UniqueName: \"kubernetes.io/projected/b1985221-eb0c-4868-b1f9-55585c1796dc-kube-api-access-9qw95\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-ovn\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.192596 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.191982 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-device-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.192010 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-socket-dir-parent\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.192024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-multus\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.192045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-slash\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.192079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-var-lib-openvswitch\") pod 
\"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.192596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.192488 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.193468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.193273 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:09:43.193468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.193299 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:09:43.193468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.193351 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.193468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.193368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kt97x\"" Apr 16 18:09:43.194268 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194061 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:43.194268 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.194149 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:43.194414 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194309 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:09:43.194414 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194393 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.194673 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194653 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:09:43.194771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194732 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:09:43.194822 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.194778 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.195782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195598 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mg4vw\"" Apr 16 18:09:43.195782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195625 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.195782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195606 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.195974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195860 2567 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.195974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195921 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbf56\"" Apr 16 18:09:43.195974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.195951 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:09:43.196114 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196017 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jphs5\"" Apr 16 18:09:43.196114 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196077 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.196114 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196078 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.196256 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-svd72\"" Apr 16 18:09:43.196878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196499 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:09:43.196878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196828 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:09:43.196878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196841 2567 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.197091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.196975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:09:43.197091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.197064 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:09:43.197199 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.197099 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bqjb\"" Apr 16 18:09:43.198389 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.197796 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.201074 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.201034 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:43.201154 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.201106 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:43.203067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.203047 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:09:43.203283 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.203257 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-95cd9\"" Apr 16 18:09:43.203466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.203452 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:09:43.203606 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.203552 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:09:43.204219 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.204199 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dlljj\"" Apr 16 18:09:43.204430 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.204412 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:09:43.229862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.229834 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:42 +0000 UTC" deadline="2028-01-24 19:32:49.468562722 +0000 UTC" Apr 16 18:09:43.229862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.229861 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15553h23m6.23870508s" Apr 16 18:09:43.282469 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.282448 2567 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:09:43.292471 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-etc-kubernetes\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292478 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-systemd\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292494 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-host\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpld\" (UniqueName: \"kubernetes.io/projected/c4bda706-5aa8-4750-aa51-1fd47724ec81-kube-api-access-zdpld\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-etc-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-etc-kubernetes\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.292604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-etc-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-script-lib\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-etc-selinux\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-lib-modules\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-etc-tuned\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-etc-selinux\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292808 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7fc\" (UniqueName: \"kubernetes.io/projected/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-kube-api-access-gw7fc\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292835 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-hostroot\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.292903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292888 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysconfig\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292916 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-hostroot\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:09:43.292944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-socket-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.292976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/679d211e-bc5a-4f8c-a0ae-1aae973fb746-host-slash\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293010 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/679d211e-bc5a-4f8c-a0ae-1aae973fb746-host-slash\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-kubelet\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-os-release\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:09:43.293114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-netns\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovn-node-metrics-cert\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-cnibin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-kubelet\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293185 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-kubernetes\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: 
I0416 18:09:43.293196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-os-release\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-script-lib\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293234 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-sys\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-cnibin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293258 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-cnibin\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.293415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293224 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-socket-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-netns\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-bin\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.294239 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:09:43.293346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-conf-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-bin\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293372 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-conf-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-device-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 
18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-socket-dir-parent\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-device-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-conf\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-serviceca\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-os-release\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: 
\"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293486 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-slash\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293504 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-socket-dir-parent\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.294239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-var-lib-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-slash\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 
18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-var-lib-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293642 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-modprobe-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-cni-dir\") pod \"multus-6cp9f\" (UID: 
\"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-konnectivity-ca\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh2kf\" (UniqueName: \"kubernetes.io/projected/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-kube-api-access-wh2kf\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293761 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-cni-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-k8s-cni-cncf-io\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293829 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-kubelet\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-multus-certs\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293832 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-k8s-cni-cncf-io\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-kubelet\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-node-log\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-node-log\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-run-multus-certs\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293973 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-daemon-config\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.293985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-var-lib-kubelet\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-systemd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-tmp\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294159 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24jq\" (UniqueName: \"kubernetes.io/projected/edb0e160-57f6-4631-88ff-4d22d6b51543-kube-api-access-q24jq\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-system-cni-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-systemd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrgw\" (UniqueName: \"kubernetes.io/projected/679d211e-bc5a-4f8c-a0ae-1aae973fb746-kube-api-access-fsrgw\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294227 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-openvswitch\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294247 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-netns\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294274 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-config\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-registration-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.295840 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmwl8\" (UniqueName: \"kubernetes.io/projected/e1b50604-060e-416e-b904-75ab9b75d209-kube-api-access-lmwl8\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-bin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edb0e160-57f6-4631-88ff-4d22d6b51543-hosts-file\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-netd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294408 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-run-netns\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-multus-daemon-config\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-host-cni-netd\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-sys-fs\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-sys-fs\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-bin\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qw95\" (UniqueName: \"kubernetes.io/projected/b1985221-eb0c-4868-b1f9-55585c1796dc-kube-api-access-9qw95\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e1b50604-060e-416e-b904-75ab9b75d209-registration-dir\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294583 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-ovn\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-multus\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294664 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-agent-certs\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:09:43.296349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294690 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-log-socket\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-host-var-lib-cni-multus\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-system-cni-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294735 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-run-ovn\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294736 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-log-socket\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-run\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-host\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294842 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovnkube-config\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxk4v\" (UniqueName: \"kubernetes.io/projected/146da900-7e40-40db-9efe-695d918b8758-kube-api-access-hxk4v\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294868 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1985221-eb0c-4868-b1f9-55585c1796dc-system-cni-dir\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294903 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edb0e160-57f6-4631-88ff-4d22d6b51543-tmp-dir\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8klrd\" (UniqueName: \"kubernetes.io/projected/d4d545a6-5b19-4165-9bd6-f5c19acf145a-kube-api-access-8klrd\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/679d211e-bc5a-4f8c-a0ae-1aae973fb746-iptables-alerter-script\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.294995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-systemd-units\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.295016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-env-overrides\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.295040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-cni-binary-copy\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.295090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-systemd-units\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.296886 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.295460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1985221-eb0c-4868-b1f9-55585c1796dc-cni-binary-copy\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.297380 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.295479 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/679d211e-bc5a-4f8c-a0ae-1aae973fb746-iptables-alerter-script\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.297380 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:09:43.295498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-env-overrides\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.297380 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.297272 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-ovn-node-metrics-cert\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.303232 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.303207 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh2kf\" (UniqueName: \"kubernetes.io/projected/9bd668bd-f016-4d96-a5ed-1376b7d9e8ec-kube-api-access-wh2kf\") pod \"ovnkube-node-htm65\" (UID: \"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:09:43.303351 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.303210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qw95\" (UniqueName: \"kubernetes.io/projected/b1985221-eb0c-4868-b1f9-55585c1796dc-kube-api-access-9qw95\") pod \"multus-6cp9f\" (UID: \"b1985221-eb0c-4868-b1f9-55585c1796dc\") " pod="openshift-multus/multus-6cp9f" Apr 16 18:09:43.303422 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.303400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrgw\" (UniqueName: \"kubernetes.io/projected/679d211e-bc5a-4f8c-a0ae-1aae973fb746-kube-api-access-fsrgw\") pod \"iptables-alerter-ttkb5\" (UID: \"679d211e-bc5a-4f8c-a0ae-1aae973fb746\") " pod="openshift-network-operator/iptables-alerter-ttkb5" Apr 16 18:09:43.303473 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.303451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmwl8\" (UniqueName: \"kubernetes.io/projected/e1b50604-060e-416e-b904-75ab9b75d209-kube-api-access-lmwl8\") pod \"aws-ebs-csi-driver-node-blhdd\" (UID: \"e1b50604-060e-416e-b904-75ab9b75d209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" Apr 16 18:09:43.303550 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.303520 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:43.311553 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.311525 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" event={"ID":"2c187cd16f95e6d697a69aca59443285","Type":"ContainerStarted","Data":"a7dfcdfe6c0fbb5f57e4cd1f6ff1166c9ea4125ea95d90643222ccca385957a5"} Apr 16 18:09:43.395691 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-run\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.395829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-host\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.395829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-run\") pod \"tuned-nd572\" 
(UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.395829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-host\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.395829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395817 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxk4v\" (UniqueName: \"kubernetes.io/projected/146da900-7e40-40db-9efe-695d918b8758-kube-api-access-hxk4v\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.396005 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395849 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edb0e160-57f6-4631-88ff-4d22d6b51543-tmp-dir\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd" Apr 16 18:09:43.396005 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.395871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8klrd\" (UniqueName: \"kubernetes.io/projected/d4d545a6-5b19-4165-9bd6-f5c19acf145a-kube-api-access-8klrd\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:43.396095 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-systemd\") pod \"tuned-nd572\" 
(UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.396150 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-host\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd" Apr 16 18:09:43.396150 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpld\" (UniqueName: \"kubernetes.io/projected/c4bda706-5aa8-4750-aa51-1fd47724ec81-kube-api-access-zdpld\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl" Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-lib-modules\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-systemd\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572" Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-host\") pod \"node-ca-87pfd\" (UID: 
\"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edb0e160-57f6-4631-88ff-4d22d6b51543-tmp-dir\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-etc-tuned\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7fc\" (UniqueName: \"kubernetes.io/projected/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-kube-api-access-gw7fc\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.396244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysconfig\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396281 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-lib-modules\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-kubernetes\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-sys\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-cnibin\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396390 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-kubernetes\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysconfig\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-conf\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396473 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-sys\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396475 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-serviceca\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-cnibin\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.396640 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396681 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-os-release\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-modprobe-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396777 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-konnectivity-ca\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-var-lib-kubelet\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-tmp\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396836 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-sysctl-conf\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396873 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-etc-modprobe-d\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-serviceca\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q24jq\" (UniqueName: \"kubernetes.io/projected/edb0e160-57f6-4631-88ff-4d22d6b51543-kube-api-access-q24jq\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-system-cni-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396974 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/146da900-7e40-40db-9efe-695d918b8758-var-lib-kubelet\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.396982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edb0e160-57f6-4631-88ff-4d22d6b51543-hosts-file\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:09:43.397324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-agent-certs\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-os-release\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edb0e160-57f6-4631-88ff-4d22d6b51543-hosts-file\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397206 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-system-cni-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.397300 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-konnectivity-ca\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.397484 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.897440722 +0000 UTC m=+3.101792544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.397478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bda706-5aa8-4750-aa51-1fd47724ec81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.398078 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.398056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.398544 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.398150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bda706-5aa8-4750-aa51-1fd47724ec81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.398628 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.398547 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-etc-tuned\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.399146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.399127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/146da900-7e40-40db-9efe-695d918b8758-tmp\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.400233 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.400215 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb1eecb7-d44f-4eaa-8a18-4e159e76feaa-agent-certs\") pod \"konnectivity-agent-ptp6j\" (UID: \"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa\") " pod="kube-system/konnectivity-agent-ptp6j"
Apr 16 18:09:43.414112 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.414056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8klrd\" (UniqueName: \"kubernetes.io/projected/d4d545a6-5b19-4165-9bd6-f5c19acf145a-kube-api-access-8klrd\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:09:43.417411 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.417383 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:43.417411 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.417408 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:43.417580 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.417420 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:43.417580 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.417476 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:43.917459763 +0000 UTC m=+3.121811571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:43.419234 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.419212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdpld\" (UniqueName: \"kubernetes.io/projected/c4bda706-5aa8-4750-aa51-1fd47724ec81-kube-api-access-zdpld\") pod \"multus-additional-cni-plugins-j5mwl\" (UID: \"c4bda706-5aa8-4750-aa51-1fd47724ec81\") " pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.419945 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.419923 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7fc\" (UniqueName: \"kubernetes.io/projected/767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92-kube-api-access-gw7fc\") pod \"node-ca-87pfd\" (UID: \"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92\") " pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.420037 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.419995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxk4v\" (UniqueName: \"kubernetes.io/projected/146da900-7e40-40db-9efe-695d918b8758-kube-api-access-hxk4v\") pod \"tuned-nd572\" (UID: \"146da900-7e40-40db-9efe-695d918b8758\") " pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.420150 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.420134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24jq\" (UniqueName: \"kubernetes.io/projected/edb0e160-57f6-4631-88ff-4d22d6b51543-kube-api-access-q24jq\") pod \"node-resolver-lc2jd\" (UID: \"edb0e160-57f6-4631-88ff-4d22d6b51543\") " pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.488716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.488692 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cp9f"
Apr 16 18:09:43.498590 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.498547 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htm65"
Apr 16 18:09:43.508100 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.508078 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ttkb5"
Apr 16 18:09:43.512660 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.512643 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd"
Apr 16 18:09:43.519195 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.519181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nd572"
Apr 16 18:09:43.525688 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.525670 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lc2jd"
Apr 16 18:09:43.532219 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.532202 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-87pfd"
Apr 16 18:09:43.538770 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.538752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ptp6j"
Apr 16 18:09:43.544287 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.544268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j5mwl"
Apr 16 18:09:43.818659 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.818572 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:43.900915 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:43.900884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:09:43.901065 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.900991 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.901065 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:43.901059 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:09:44.901040039 +0000 UTC m=+4.105391856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:43.941514 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.941471 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1eecb7_d44f_4eaa_8a18_4e159e76feaa.slice/crio-85250bceeb85c7103ab2a49ae4328769fd6f9cad1480f268c7d2f42f9d55e5c8 WatchSource:0}: Error finding container 85250bceeb85c7103ab2a49ae4328769fd6f9cad1480f268c7d2f42f9d55e5c8: Status 404 returned error can't find the container with id 85250bceeb85c7103ab2a49ae4328769fd6f9cad1480f268c7d2f42f9d55e5c8
Apr 16 18:09:43.942285 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.942222 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4bda706_5aa8_4750_aa51_1fd47724ec81.slice/crio-3155e0e26e5788a645674132b3d9ab2862e5b74bd2e263aee65d0c576fb6633b WatchSource:0}: Error finding container 3155e0e26e5788a645674132b3d9ab2862e5b74bd2e263aee65d0c576fb6633b: Status 404 returned error can't find the container with id 3155e0e26e5788a645674132b3d9ab2862e5b74bd2e263aee65d0c576fb6633b
Apr 16 18:09:43.943088 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.942989 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb0e160_57f6_4631_88ff_4d22d6b51543.slice/crio-210508a798606b0c2f6d272087142c141a707bccb34c76f2840cc77e5a032215 WatchSource:0}: Error finding container 210508a798606b0c2f6d272087142c141a707bccb34c76f2840cc77e5a032215: Status 404 returned error can't find the container with id 210508a798606b0c2f6d272087142c141a707bccb34c76f2840cc77e5a032215
Apr 16 18:09:43.943871 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.943831 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1985221_eb0c_4868_b1f9_55585c1796dc.slice/crio-6649505812e9fd8a094468515dda1444d6ddbdd181ee7453fb165950929dae56 WatchSource:0}: Error finding container 6649505812e9fd8a094468515dda1444d6ddbdd181ee7453fb165950929dae56: Status 404 returned error can't find the container with id 6649505812e9fd8a094468515dda1444d6ddbdd181ee7453fb165950929dae56
Apr 16 18:09:43.946876 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.946858 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146da900_7e40_40db_9efe_695d918b8758.slice/crio-dedd241d142e8fa83beb83340a07663cee6f9cf0c39dabab29f63e880b16ed8e WatchSource:0}: Error finding container dedd241d142e8fa83beb83340a07663cee6f9cf0c39dabab29f63e880b16ed8e: Status 404 returned error can't find the container with id dedd241d142e8fa83beb83340a07663cee6f9cf0c39dabab29f63e880b16ed8e
Apr 16 18:09:43.948543 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.948523 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd668bd_f016_4d96_a5ed_1376b7d9e8ec.slice/crio-2b5bacdac2e9513c545361fcd95a35d6c747be6c51ce4174e2a1e6f3730f6e6a WatchSource:0}: Error finding container 2b5bacdac2e9513c545361fcd95a35d6c747be6c51ce4174e2a1e6f3730f6e6a: Status 404 returned error can't find the container with id 2b5bacdac2e9513c545361fcd95a35d6c747be6c51ce4174e2a1e6f3730f6e6a
Apr 16 18:09:43.949238 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.949216 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b50604_060e_416e_b904_75ab9b75d209.slice/crio-be6e235d8d143fdcf0c580a5a63b2dc8878a899f2761703a0c0c5ba71a3401aa WatchSource:0}: Error finding container be6e235d8d143fdcf0c580a5a63b2dc8878a899f2761703a0c0c5ba71a3401aa: Status 404 returned error can't find the container with id be6e235d8d143fdcf0c580a5a63b2dc8878a899f2761703a0c0c5ba71a3401aa
Apr 16 18:09:43.950366 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.950303 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod679d211e_bc5a_4f8c_a0ae_1aae973fb746.slice/crio-8e040950b1add70c48309743bd3a93ce2d798911b059efc4944a8642fe8ccc35 WatchSource:0}: Error finding container 8e040950b1add70c48309743bd3a93ce2d798911b059efc4944a8642fe8ccc35: Status 404 returned error can't find the container with id 8e040950b1add70c48309743bd3a93ce2d798911b059efc4944a8642fe8ccc35
Apr 16 18:09:43.951723 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:09:43.951665 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767a8b0d_4ec4_41ae_a78e_8d1bc4c8cc92.slice/crio-4571476e333e3ff9168b06ed2458bc8e38824bbcbb34574c6332475fec34db2b WatchSource:0}: Error finding container 4571476e333e3ff9168b06ed2458bc8e38824bbcbb34574c6332475fec34db2b: Status 404 returned error can't find the container with id 4571476e333e3ff9168b06ed2458bc8e38824bbcbb34574c6332475fec34db2b
Apr 16 18:09:44.001177 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.001153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv"
Apr 16 18:09:44.001320 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.001302 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:44.001363 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.001328 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:44.001363 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.001340 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:44.001428 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.001384 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:45.001370242 +0000 UTC m=+4.205722051 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:44.230553 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.230471 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:42 +0000 UTC" deadline="2027-12-01 06:59:35.411368484 +0000 UTC"
Apr 16 18:09:44.230553 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.230507 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14244h49m51.180865071s"
Apr 16 18:09:44.306766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.306260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:09:44.306766 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.306398 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a"
Apr 16 18:09:44.317609 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.317535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ptp6j" event={"ID":"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa","Type":"ContainerStarted","Data":"85250bceeb85c7103ab2a49ae4328769fd6f9cad1480f268c7d2f42f9d55e5c8"}
Apr 16 18:09:44.319192 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.319130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nd572" event={"ID":"146da900-7e40-40db-9efe-695d918b8758","Type":"ContainerStarted","Data":"dedd241d142e8fa83beb83340a07663cee6f9cf0c39dabab29f63e880b16ed8e"}
Apr 16 18:09:44.321903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.321878 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cp9f" event={"ID":"b1985221-eb0c-4868-b1f9-55585c1796dc","Type":"ContainerStarted","Data":"6649505812e9fd8a094468515dda1444d6ddbdd181ee7453fb165950929dae56"}
Apr 16 18:09:44.328372 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.327915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" event={"ID":"2c187cd16f95e6d697a69aca59443285","Type":"ContainerStarted","Data":"e18eb8e07f603d0442c0e2a683dbc1d822b87498394c0c9a1ef45f6d0a538696"}
Apr 16 18:09:44.335258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.335206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-87pfd" event={"ID":"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92","Type":"ContainerStarted","Data":"4571476e333e3ff9168b06ed2458bc8e38824bbcbb34574c6332475fec34db2b"}
Apr 16 18:09:44.339268 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.339246 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ttkb5" event={"ID":"679d211e-bc5a-4f8c-a0ae-1aae973fb746","Type":"ContainerStarted","Data":"8e040950b1add70c48309743bd3a93ce2d798911b059efc4944a8642fe8ccc35"}
Apr 16 18:09:44.341265 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.341114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" event={"ID":"e1b50604-060e-416e-b904-75ab9b75d209","Type":"ContainerStarted","Data":"be6e235d8d143fdcf0c580a5a63b2dc8878a899f2761703a0c0c5ba71a3401aa"}
Apr 16 18:09:44.347794 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.347770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"2b5bacdac2e9513c545361fcd95a35d6c747be6c51ce4174e2a1e6f3730f6e6a"}
Apr 16 18:09:44.352417 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.352395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerStarted","Data":"3155e0e26e5788a645674132b3d9ab2862e5b74bd2e263aee65d0c576fb6633b"}
Apr 16 18:09:44.355547 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.355524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lc2jd" event={"ID":"edb0e160-57f6-4631-88ff-4d22d6b51543","Type":"ContainerStarted","Data":"210508a798606b0c2f6d272087142c141a707bccb34c76f2840cc77e5a032215"}
Apr 16 18:09:44.906778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:44.906523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:09:44.906921 ip-10-0-141-189 kubenswrapper[2567]: E0416
18:09:44.906866 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:44.906981 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:44.906926 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:09:46.906908036 +0000 UTC m=+6.111259855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:45.007296 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:45.007244 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:45.007438 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:45.007421 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:45.007502 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:45.007449 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:45.007502 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:45.007461 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod 
openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:45.007625 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:45.007516 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:47.007498563 +0000 UTC m=+6.211850375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:45.306988 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:45.306954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:45.307539 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:45.307074 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:45.380897 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:45.380866 2567 generic.go:358] "Generic (PLEG): container finished" podID="8e688c1981efc927195def8244abca8f" containerID="5764ddaa479648d2e0b6b8ebfc5a66f4bd0fcf54ae2c7883f921f62b5a5bbd37" exitCode=0 Apr 16 18:09:45.381770 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:45.381503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"8e688c1981efc927195def8244abca8f","Type":"ContainerDied","Data":"5764ddaa479648d2e0b6b8ebfc5a66f4bd0fcf54ae2c7883f921f62b5a5bbd37"} Apr 16 18:09:45.396512 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:45.396463 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" podStartSLOduration=3.39644485 podStartE2EDuration="3.39644485s" podCreationTimestamp="2026-04-16 18:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:44.341471069 +0000 UTC m=+3.545822896" watchObservedRunningTime="2026-04-16 18:09:45.39644485 +0000 UTC m=+4.600796676" Apr 16 18:09:46.307038 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:46.306483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:46.307038 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:46.306638 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:46.393996 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:46.393922 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"8e688c1981efc927195def8244abca8f","Type":"ContainerStarted","Data":"0457dac70d4ca063e31facbe673c6fdc1e13b88c7734ab4e5bb9c9ddedca97b5"} Apr 16 18:09:46.407062 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:46.407016 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" podStartSLOduration=4.40699919 podStartE2EDuration="4.40699919s" podCreationTimestamp="2026-04-16 18:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:46.406970643 +0000 UTC m=+5.611322473" watchObservedRunningTime="2026-04-16 18:09:46.40699919 +0000 UTC m=+5.611351013" Apr 16 18:09:46.924016 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:46.923460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:46.924016 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:46.923617 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:46.924016 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:46.923683 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a 
nodeName:}" failed. No retries permitted until 2026-04-16 18:09:50.92366561 +0000 UTC m=+10.128017423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:47.024513 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:47.023930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:47.024513 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:47.024077 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:47.024513 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:47.024097 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:47.024513 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:47.024111 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:47.024513 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:47.024179 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 
podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:51.024150592 +0000 UTC m=+10.228502409 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:47.306666 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:47.306590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:47.306814 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:47.306718 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:48.306187 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:48.306146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:48.306612 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:48.306290 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:49.306835 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:49.306802 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:49.307292 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:49.306922 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:50.306366 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:50.306334 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:50.306537 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:50.306479 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:50.954810 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:50.954774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:50.955270 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:50.954935 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:50.955270 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:50.955016 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:09:58.954995619 +0000 UTC m=+18.159347427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:51.055994 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:51.055959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:51.056231 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:51.056169 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:51.056231 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:51.056191 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:51.056231 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:51.056202 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:51.056491 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:51.056259 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:09:59.056244453 +0000 UTC m=+18.260596258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:51.308670 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:51.308200 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:51.308670 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:51.308307 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:52.306469 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:52.306431 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:52.307052 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:52.306593 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:53.306069 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:53.305989 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:53.306231 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:53.306098 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:54.306175 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:54.306137 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:54.306618 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:54.306264 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:55.306070 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:55.306035 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:55.306251 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:55.306165 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:56.306606 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:56.306577 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:56.307243 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:56.306709 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:57.306968 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:57.306933 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:57.307440 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:57.307056 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:09:58.306295 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:58.306257 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:58.306519 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:58.306405 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:09:59.018673 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:59.018635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:09:59.019094 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.018791 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:59.019094 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.018859 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.018840256 +0000 UTC m=+34.223192075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:59.119939 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:59.119900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:59.120120 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.120060 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:59.120120 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.120076 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:59.120120 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.120089 2567 projected.go:194] Error preparing data for projected volume kube-api-access-wxj77 for pod openshift-network-diagnostics/network-check-target-mq7lv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:59.120271 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.120151 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77 podName:1480d7f1-b475-4fc4-9052-cecacd28ac08 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:15.120133669 +0000 UTC m=+34.324485482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wxj77" (UniqueName: "kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77") pod "network-check-target-mq7lv" (UID: "1480d7f1-b475-4fc4-9052-cecacd28ac08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:59.306698 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:09:59.306619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:09:59.306864 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:09:59.306740 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:00.305960 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:00.305928 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:00.306405 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:00.306067 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:01.309774 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.308270 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:01.309774 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:01.308594 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:01.425219 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.424909 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-87pfd" event={"ID":"767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92","Type":"ContainerStarted","Data":"1e164de3578e4bbf79af758d1df9bb00e349dc293b8215fd65d6f340abcfd4aa"} Apr 16 18:10:01.427631 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.427593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"358f9a0e94e02142ba58dc039029d15222820b87689267de25ac665fe0665e9c"} Apr 16 18:10:01.429806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.429623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerStarted","Data":"dbffd67d35fd4a1ae2b5b7fb70a2b19f21424833ad01ca183edd54888a39e2a8"} Apr 16 18:10:01.434023 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.434000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-nd572" event={"ID":"146da900-7e40-40db-9efe-695d918b8758","Type":"ContainerStarted","Data":"9e391c5fc903b1c10a9e865176a7f46f458577448289f44d649994cb8bad8144"} Apr 16 18:10:01.435256 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.435236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cp9f" event={"ID":"b1985221-eb0c-4868-b1f9-55585c1796dc","Type":"ContainerStarted","Data":"0362c8b0f0b913b377217235a460f69dc894fb3b931a550298092244fd69917c"} Apr 16 18:10:01.440745 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.440337 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-87pfd" podStartSLOduration=11.28839339 podStartE2EDuration="20.440322944s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.954499087 +0000 UTC m=+3.158850898" lastFinishedPulling="2026-04-16 18:09:53.106428632 +0000 UTC m=+12.310780452" observedRunningTime="2026-04-16 18:10:01.440011788 +0000 UTC m=+20.644363640" watchObservedRunningTime="2026-04-16 18:10:01.440322944 +0000 UTC m=+20.644674773" Apr 16 18:10:01.456914 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.456516 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nd572" podStartSLOduration=3.271504706 podStartE2EDuration="20.456498672s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.948655946 +0000 UTC m=+3.153007750" lastFinishedPulling="2026-04-16 18:10:01.133649903 +0000 UTC m=+20.338001716" observedRunningTime="2026-04-16 18:10:01.455836203 +0000 UTC m=+20.660188030" watchObservedRunningTime="2026-04-16 18:10:01.456498672 +0000 UTC m=+20.660850501" Apr 16 18:10:01.474490 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.474449 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-ptp6j" podStartSLOduration=3.286267861 podStartE2EDuration="20.474435709s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.943895534 +0000 UTC m=+3.148247341" lastFinishedPulling="2026-04-16 18:10:01.132063371 +0000 UTC m=+20.336415189" observedRunningTime="2026-04-16 18:10:01.473987654 +0000 UTC m=+20.678339488" watchObservedRunningTime="2026-04-16 18:10:01.474435709 +0000 UTC m=+20.678787536" Apr 16 18:10:01.513460 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:01.513402 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6cp9f" podStartSLOduration=3.178450806 podStartE2EDuration="20.513383483s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.945724614 +0000 UTC m=+3.150076434" lastFinishedPulling="2026-04-16 18:10:01.280657293 +0000 UTC m=+20.485009111" observedRunningTime="2026-04-16 18:10:01.513290938 +0000 UTC m=+20.717642764" watchObservedRunningTime="2026-04-16 18:10:01.513383483 +0000 UTC m=+20.717735310" Apr 16 18:10:02.306365 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.306178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:02.306511 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:02.306406 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:02.438382 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.438347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" event={"ID":"e1b50604-060e-416e-b904-75ab9b75d209","Type":"ContainerStarted","Data":"160ddf0e6129b722e4f638cfffe831e1e984ad9fae704243cb2628c65df8e1e7"} Apr 16 18:10:02.440638 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.440613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"07261a8317d744884ebc92b723d1c605b8f54b181730bac2c5fe4893477adb09"} Apr 16 18:10:02.440638 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.440643 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"f33e7675a68c35bf43bae0f19b8584ab66602e7fd81e17b44db12b1e87b8c0f4"} Apr 16 18:10:02.440800 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.440652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"c2113a3a7556fa6ccbe10852e2e77082a854dd3eafa648dc5ec0636f21155c6d"} Apr 16 18:10:02.440800 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.440660 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"8a23f0a8a4019af8b24e669518040182ae2fd32a3447e92e3a6ab2713ed07625"} Apr 16 18:10:02.440800 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.440672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" 
event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"816cf3277537f55ff2ef128df6eec4e9d64a60711ebdf2206c68ef33559b1a12"} Apr 16 18:10:02.441755 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.441735 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="dbffd67d35fd4a1ae2b5b7fb70a2b19f21424833ad01ca183edd54888a39e2a8" exitCode=0 Apr 16 18:10:02.441826 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.441790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"dbffd67d35fd4a1ae2b5b7fb70a2b19f21424833ad01ca183edd54888a39e2a8"} Apr 16 18:10:02.444735 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.444712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lc2jd" event={"ID":"edb0e160-57f6-4631-88ff-4d22d6b51543","Type":"ContainerStarted","Data":"1dddf22789462a3bac063c539a20dc9817b5fdd4d3a5cccd1f82109bac23dd0e"} Apr 16 18:10:02.446061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.446039 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ptp6j" event={"ID":"bb1eecb7-d44f-4eaa-8a18-4e159e76feaa","Type":"ContainerStarted","Data":"a447ed98b7b348f19127b17b273f49927e9d0b1866bfe7f2587d8eb09cb5a223"} Apr 16 18:10:02.483732 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.483607 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lc2jd" podStartSLOduration=4.296943765 podStartE2EDuration="21.483594337s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.945362877 +0000 UTC m=+3.149714682" lastFinishedPulling="2026-04-16 18:10:01.132013437 +0000 UTC m=+20.336365254" observedRunningTime="2026-04-16 18:10:02.483395306 +0000 UTC m=+21.687747143" 
watchObservedRunningTime="2026-04-16 18:10:02.483594337 +0000 UTC m=+21.687946163" Apr 16 18:10:02.594598 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:02.594551 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:10:03.244936 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.244817 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:02.594593833Z","UUID":"3d70cfb4-1dea-4adb-afe1-8b6618b39065","Handler":null,"Name":"","Endpoint":""} Apr 16 18:10:03.247769 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.247746 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:10:03.247912 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.247778 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:10:03.306981 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.306953 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:03.307135 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:03.307079 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:03.449710 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.449651 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ttkb5" event={"ID":"679d211e-bc5a-4f8c-a0ae-1aae973fb746","Type":"ContainerStarted","Data":"154b87ffd3c3e9227888555ba2eb7878c21c35d6802eba3038e7c162d7cbaafe"} Apr 16 18:10:03.451699 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.451673 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" event={"ID":"e1b50604-060e-416e-b904-75ab9b75d209","Type":"ContainerStarted","Data":"0100328e6af51896d5a25ef70e622b2b85491f9ed7ca79f17905767cda884460"} Apr 16 18:10:03.465457 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:03.465396 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ttkb5" podStartSLOduration=5.287870487 podStartE2EDuration="22.465378371s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.954506151 +0000 UTC m=+3.158857969" lastFinishedPulling="2026-04-16 18:10:01.132014035 +0000 UTC m=+20.336365853" observedRunningTime="2026-04-16 18:10:03.464982753 +0000 UTC m=+22.669334579" watchObservedRunningTime="2026-04-16 18:10:03.465378371 +0000 UTC m=+22.669730197" Apr 16 18:10:04.306113 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:04.306085 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:04.306286 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:04.306192 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:04.455920 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:04.455870 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" event={"ID":"e1b50604-060e-416e-b904-75ab9b75d209","Type":"ContainerStarted","Data":"1dfe1c241df832a0bd6a9648d63b1f6a9694d816266aa7ebb803ca30f80f8456"} Apr 16 18:10:04.459412 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:04.459387 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"504e9f36cdf10f403530663ea88f8d3576ef5295542c28d625729abb5860b3f7"} Apr 16 18:10:04.480639 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:04.480588 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-blhdd" podStartSLOduration=4.045834508 podStartE2EDuration="23.480572255s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.951204129 +0000 UTC m=+3.155555937" lastFinishedPulling="2026-04-16 18:10:03.385941876 +0000 UTC m=+22.590293684" observedRunningTime="2026-04-16 18:10:04.4804701 +0000 UTC m=+23.684821927" watchObservedRunningTime="2026-04-16 18:10:04.480572255 +0000 UTC m=+23.684924085" Apr 16 18:10:05.306529 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:05.306495 2567 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:05.306727 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:05.306642 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:06.033334 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.033299 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:10:06.033994 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.033975 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:10:06.306709 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.306682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:06.306862 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:06.306787 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:06.467365 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.467089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" event={"ID":"9bd668bd-f016-4d96-a5ed-1376b7d9e8ec","Type":"ContainerStarted","Data":"7b291ec746e91ccd678feb7389432ffdc1184b0350d21426efa320ceaa9ddd46"} Apr 16 18:10:06.467774 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.467473 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:10:06.467839 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.467826 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:06.467983 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.467849 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:06.467983 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.467860 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:06.468427 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.468063 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ptp6j" Apr 16 18:10:06.487011 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.486629 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:06.488109 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.487942 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:06.495834 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:06.495606 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" podStartSLOduration=8.275859111 podStartE2EDuration="25.495590077s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.951203046 +0000 UTC m=+3.155554860" lastFinishedPulling="2026-04-16 18:10:01.170934019 +0000 UTC m=+20.375285826" observedRunningTime="2026-04-16 18:10:06.495404397 +0000 UTC m=+25.699756225" watchObservedRunningTime="2026-04-16 18:10:06.495590077 +0000 UTC m=+25.699941894" Apr 16 18:10:07.306045 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:07.306017 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:07.306540 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:07.306103 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:07.470107 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:07.470079 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="5b21cd62f090e261bd0c4b8d7174ed575d906cbd0a7d769183f4be9aa338130c" exitCode=0 Apr 16 18:10:07.470225 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:07.470156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"5b21cd62f090e261bd0c4b8d7174ed575d906cbd0a7d769183f4be9aa338130c"} Apr 16 18:10:08.307133 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.306938 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:08.307442 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:08.307173 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:08.361407 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.361377 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mq7lv"] Apr 16 18:10:08.361514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.361478 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:08.361622 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:08.361547 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:08.363992 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.363970 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bbpzd"] Apr 16 18:10:08.473326 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.473296 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="009596b08b1e4efe0800a81e82148ddbed365ce7a037c53521d0d9f870e7bec3" exitCode=0 Apr 16 18:10:08.473454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.473325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"009596b08b1e4efe0800a81e82148ddbed365ce7a037c53521d0d9f870e7bec3"} Apr 16 18:10:08.473454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:08.473382 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:08.473619 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:08.473602 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:09.476787 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:09.476716 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="f46e7f63531a79b3decb655bc2f3da1f68dfe09031d0254c1b1788a898517264" exitCode=0 Apr 16 18:10:09.476787 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:09.476766 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"f46e7f63531a79b3decb655bc2f3da1f68dfe09031d0254c1b1788a898517264"} Apr 16 18:10:10.306987 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:10.306912 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:10.307124 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:10.307040 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:10.307124 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:10.306920 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:10.307239 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:10.307187 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:12.306226 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:12.306149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:12.306895 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:12.306154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:12.306895 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:12.306261 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mq7lv" podUID="1480d7f1-b475-4fc4-9052-cecacd28ac08" Apr 16 18:10:12.306895 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:12.306350 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a" Apr 16 18:10:14.166211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.166185 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeReady" Apr 16 18:10:14.166704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.166363 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:10:14.218002 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.217963 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lp6zw"] Apr 16 18:10:14.245690 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.245660 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vvblr"] Apr 16 18:10:14.245869 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.245844 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.248973 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.248767 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:10:14.248973 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.248818 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:10:14.248973 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.248876 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\"" Apr 16 18:10:14.263968 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.263943 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vvblr"] Apr 16 18:10:14.263968 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.263972 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp6zw"] 
Apr 16 18:10:14.264141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.264073 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.266772 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.266544 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:10:14.266772 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.266549 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:10:14.266772 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.266616 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:10:14.266772 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.266629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\"" Apr 16 18:10:14.306665 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.306642 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:14.306665 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.306659 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:14.309123 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.309100 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:10:14.309230 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.309151 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:10:14.309230 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.309108 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\"" Apr 16 18:10:14.309347 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.309302 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:10:14.309347 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.309328 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bnhf5\"" Apr 16 18:10:14.336164 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.336137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-config-volume\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.336258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.336182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8s8\" (UniqueName: \"kubernetes.io/projected/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-kube-api-access-jf8s8\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " 
pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.336258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.336210 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.336258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.336247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-tmp-dir\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.437634 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-tmp-dir\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.437815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-config-volume\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.437815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437666 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxznx\" (UniqueName: \"kubernetes.io/projected/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-kube-api-access-gxznx\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " 
pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.437815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8s8\" (UniqueName: \"kubernetes.io/projected/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-kube-api-access-jf8s8\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.437815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437745 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.437815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.438058 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:14.437889 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:14.438058 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.437932 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-tmp-dir\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.438058 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:14.437951 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:14.937934114 +0000 UTC m=+34.142285927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:14.438298 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.438277 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-config-volume\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.450167 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.450001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8s8\" (UniqueName: \"kubernetes.io/projected/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-kube-api-access-jf8s8\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.538228 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.538200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.538394 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.538283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxznx\" (UniqueName: \"kubernetes.io/projected/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-kube-api-access-gxznx\") pod \"ingress-canary-vvblr\" 
(UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.538394 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:14.538364 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:14.538482 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:14.538437 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.038416566 +0000 UTC m=+34.242768373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:14.550202 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.550178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxznx\" (UniqueName: \"kubernetes.io/projected/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-kube-api-access-gxznx\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:14.940403 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:14.940357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:14.940713 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:14.940511 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:14.940713 ip-10-0-141-189 kubenswrapper[2567]: 
E0416 18:10:14.940611 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.940589541 +0000 UTC m=+35.144941357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:15.040934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.040906 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:15.041061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.040971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:15.041061 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.041053 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:15.041133 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.041108 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:16.041095874 +0000 UTC m=+35.245447678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:15.041183 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.041052 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:10:15.041183 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.041166 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.041154878 +0000 UTC m=+66.245506688 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : secret "metrics-daemon-secret" not found Apr 16 18:10:15.141937 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.141905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:15.144598 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.144576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxj77\" (UniqueName: \"kubernetes.io/projected/1480d7f1-b475-4fc4-9052-cecacd28ac08-kube-api-access-wxj77\") pod \"network-check-target-mq7lv\" (UID: \"1480d7f1-b475-4fc4-9052-cecacd28ac08\") " pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 
18:10:15.222442 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.222395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:15.382497 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.382451 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mq7lv"] Apr 16 18:10:15.386963 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:10:15.386937 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1480d7f1_b475_4fc4_9052_cecacd28ac08.slice/crio-b055e0c4f7c4265714cb13774e2db92a637e471dd90c2bf2baf320b60deb3834 WatchSource:0}: Error finding container b055e0c4f7c4265714cb13774e2db92a637e471dd90c2bf2baf320b60deb3834: Status 404 returned error can't find the container with id b055e0c4f7c4265714cb13774e2db92a637e471dd90c2bf2baf320b60deb3834 Apr 16 18:10:15.489261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.489198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mq7lv" event={"ID":"1480d7f1-b475-4fc4-9052-cecacd28ac08","Type":"ContainerStarted","Data":"b055e0c4f7c4265714cb13774e2db92a637e471dd90c2bf2baf320b60deb3834"} Apr 16 18:10:15.948406 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:15.948369 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:15.948584 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.948523 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:15.948646 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:15.948613 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.948587892 +0000 UTC m=+37.152939711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:16.049130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:16.049102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:16.049293 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:16.049241 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:16.049338 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:16.049320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:18.049296544 +0000 UTC m=+37.253648352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:16.494220 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:16.494182 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="afe2346a4d5d0a7f9a4c1b1e24f25c368d69f022c87e449f497733499c588099" exitCode=0 Apr 16 18:10:16.494610 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:16.494239 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"afe2346a4d5d0a7f9a4c1b1e24f25c368d69f022c87e449f497733499c588099"} Apr 16 18:10:17.499316 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:17.499104 2567 generic.go:358] "Generic (PLEG): container finished" podID="c4bda706-5aa8-4750-aa51-1fd47724ec81" containerID="ab313a2160e78ee039477918e8bed254bf0cc63838f695b30e7371e557bed934" exitCode=0 Apr 16 18:10:17.499773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:17.499192 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerDied","Data":"ab313a2160e78ee039477918e8bed254bf0cc63838f695b30e7371e557bed934"} Apr 16 18:10:17.964034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:17.964001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:17.964191 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:17.964119 2567 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:17.964191 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:17.964180 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:21.964165111 +0000 UTC m=+41.168516922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:18.065269 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:18.065235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:18.065419 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:18.065338 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:18.065419 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:18.065387 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:22.065374548 +0000 UTC m=+41.269726352 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:18.504432 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:18.504397 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" event={"ID":"c4bda706-5aa8-4750-aa51-1fd47724ec81","Type":"ContainerStarted","Data":"ac6e79242f150599e3a4d9b0b87c70ec1785bce091547bab39cec611df9d399c"} Apr 16 18:10:18.529833 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:18.529776 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j5mwl" podStartSLOduration=6.055028579 podStartE2EDuration="37.52976244s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:09:43.944753522 +0000 UTC m=+3.149105325" lastFinishedPulling="2026-04-16 18:10:15.419487382 +0000 UTC m=+34.623839186" observedRunningTime="2026-04-16 18:10:18.528306978 +0000 UTC m=+37.732658798" watchObservedRunningTime="2026-04-16 18:10:18.52976244 +0000 UTC m=+37.734114263" Apr 16 18:10:19.507789 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:19.507755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mq7lv" event={"ID":"1480d7f1-b475-4fc4-9052-cecacd28ac08","Type":"ContainerStarted","Data":"6e7fa3833d1502714a95213e963c24ed2ebc7b8cd4b59af4057695773c460b37"} Apr 16 18:10:19.508147 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:19.507994 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:19.528895 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:19.528852 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-mq7lv" podStartSLOduration=35.499430307 podStartE2EDuration="38.52884127s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:15.397106601 +0000 UTC m=+34.601458404" lastFinishedPulling="2026-04-16 18:10:18.426517563 +0000 UTC m=+37.630869367" observedRunningTime="2026-04-16 18:10:19.527945794 +0000 UTC m=+38.732297620" watchObservedRunningTime="2026-04-16 18:10:19.52884127 +0000 UTC m=+38.733193093" Apr 16 18:10:21.992072 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:21.992033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:21.992437 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:21.992188 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:21.992437 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:21.992265 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:29.992248119 +0000 UTC m=+49.196599924 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:22.093197 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:22.093168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:22.093323 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:22.093302 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:22.093367 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:22.093355 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:30.093340987 +0000 UTC m=+49.297692791 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:30.043277 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:30.043243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:30.043648 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:30.043361 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:30.043648 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:30.043417 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.04340436 +0000 UTC m=+65.247756164 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:30.143975 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:30.143933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:30.144145 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:30.144110 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:30.144194 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:30.144170 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.14415541 +0000 UTC m=+65.348507214 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:38.504369 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:38.504330 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htm65" Apr 16 18:10:46.046034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:46.045997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw" Apr 16 18:10:46.046425 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:46.046134 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:46.046425 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:46.046208 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.046192021 +0000 UTC m=+97.250543826 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found Apr 16 18:10:46.146758 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:46.146728 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:10:46.146888 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:46.146869 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:46.146941 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:46.146931 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.146916324 +0000 UTC m=+97.351268129 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found Apr 16 18:10:47.052458 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:47.052416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:10:47.052886 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:47.052538 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:10:47.052886 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:10:47.052604 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:11:51.052589437 +0000 UTC m=+130.256941241 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : secret "metrics-daemon-secret" not found Apr 16 18:10:50.512468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:50.512439 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mq7lv" Apr 16 18:10:51.943099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.943069 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"] Apr 16 18:10:51.948160 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.948142 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" Apr 16 18:10:51.950229 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.950202 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:10:51.950446 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.950430 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:10:51.950529 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.950509 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:10:51.951238 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.951216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:10:51.951344 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.951242 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 18:10:51.951344 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.951277 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 18:10:51.951344 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.951216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 18:10:51.959898 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.959873 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"]
Apr 16 18:10:51.984597 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984547 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:51.984766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:51.984766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tvkb\" (UniqueName: \"kubernetes.io/projected/20183c7a-22bd-4fbc-b9ae-34b39f843753-kube-api-access-9tvkb\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:51.984766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:51.984766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/20183c7a-22bd-4fbc-b9ae-34b39f843753-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:51.984766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:51.984756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085279 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/20183c7a-22bd-4fbc-b9ae-34b39f843753-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085525 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085325 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085525 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085525 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.085525 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.085479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tvkb\" (UniqueName: \"kubernetes.io/projected/20183c7a-22bd-4fbc-b9ae-34b39f843753-kube-api-access-9tvkb\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.086153 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.086126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/20183c7a-22bd-4fbc-b9ae-34b39f843753-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.089091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.089061 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-ca\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.089091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.089085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.089218 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.089186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.089284 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.089269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/20183c7a-22bd-4fbc-b9ae-34b39f843753-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.094464 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.094438 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tvkb\" (UniqueName: \"kubernetes.io/projected/20183c7a-22bd-4fbc-b9ae-34b39f843753-kube-api-access-9tvkb\") pod \"cluster-proxy-proxy-agent-78f8db7b48-qb5kl\" (UID: \"20183c7a-22bd-4fbc-b9ae-34b39f843753\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.268787 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.268699 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"
Apr 16 18:10:52.419136 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.419105 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl"]
Apr 16 18:10:52.423030 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:10:52.422986 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20183c7a_22bd_4fbc_b9ae_34b39f843753.slice/crio-2f55810b4b810e602fb7e83f825ae1f62ddbba3f20ad2fa9156770ef91bd3315 WatchSource:0}: Error finding container 2f55810b4b810e602fb7e83f825ae1f62ddbba3f20ad2fa9156770ef91bd3315: Status 404 returned error can't find the container with id 2f55810b4b810e602fb7e83f825ae1f62ddbba3f20ad2fa9156770ef91bd3315
Apr 16 18:10:52.568237 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:52.568142 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerStarted","Data":"2f55810b4b810e602fb7e83f825ae1f62ddbba3f20ad2fa9156770ef91bd3315"}
Apr 16 18:10:55.575927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:55.575885 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerStarted","Data":"a1ca1560d08f75fa573b88426e26a0665bf9e67339efc422789726c90f537e5c"}
Apr 16 18:10:57.581300 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:57.581266 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerStarted","Data":"017385975edaa6ebb4fc0dd42012647aaefaf7b31f1db94981639980ff867649"}
Apr 16 18:10:57.581300 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:57.581301 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerStarted","Data":"16fc49635d7dc441ff970723445d8c01de3a676b83d0fe1eabf954ac644f7320"}
Apr 16 18:10:57.602597 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:10:57.602528 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" podStartSLOduration=2.020167424 podStartE2EDuration="6.602510817s" podCreationTimestamp="2026-04-16 18:10:51 +0000 UTC" firstStartedPulling="2026-04-16 18:10:52.424618875 +0000 UTC m=+71.628970683" lastFinishedPulling="2026-04-16 18:10:57.006962272 +0000 UTC m=+76.211314076" observedRunningTime="2026-04-16 18:10:57.601179708 +0000 UTC m=+76.805531568" watchObservedRunningTime="2026-04-16 18:10:57.602510817 +0000 UTC m=+76.806862676"
Apr 16 18:11:18.071188 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:18.071153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw"
Apr 16 18:11:18.071611 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:18.071260 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:18.071611 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:18.071313 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls podName:05e20dd4-40f5-487a-b14b-4e0d5b971aeb nodeName:}" failed. No retries permitted until 2026-04-16 18:12:22.071300362 +0000 UTC m=+161.275652165 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls") pod "dns-default-lp6zw" (UID: "05e20dd4-40f5-487a-b14b-4e0d5b971aeb") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:18.171826 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:18.171751 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr"
Apr 16 18:11:18.171947 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:18.171843 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:18.171947 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:18.171888 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert podName:040d516b-7ed2-4aa4-b0b4-bc6131bac5cf nodeName:}" failed. No retries permitted until 2026-04-16 18:12:22.171875376 +0000 UTC m=+161.376227181 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert") pod "ingress-canary-vvblr" (UID: "040d516b-7ed2-4aa4-b0b4-bc6131bac5cf") : secret "canary-serving-cert" not found
Apr 16 18:11:51.092196 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:51.092153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd"
Apr 16 18:11:51.092709 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:51.092292 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:11:51.092709 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:51.092358 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs podName:d4d545a6-5b19-4165-9bd6-f5c19acf145a nodeName:}" failed. No retries permitted until 2026-04-16 18:13:53.092344113 +0000 UTC m=+252.296695917 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs") pod "network-metrics-daemon-bbpzd" (UID: "d4d545a6-5b19-4165-9bd6-f5c19acf145a") : secret "metrics-daemon-secret" not found
Apr 16 18:11:55.881259 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.881218 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"]
Apr 16 18:11:55.883094 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.883078 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:55.885278 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.885251 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:11:55.885408 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.885340 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:11:55.885788 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.885771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tt7tt\""
Apr 16 18:11:55.886005 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.885992 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:11:55.893715 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.893683 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:11:55.899152 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:55.899124 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"]
Apr 16 18:11:56.027452 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhsj\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027662 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027662 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027545 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027662 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027622 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027662 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027648 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027795 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027795 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027726 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.027795 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.027747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128292 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128410 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128410 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128345 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128410 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128371 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128410 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.128399 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.128415 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695bcd9585-lcxln: secret "image-registry-tls" not found
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.128487 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls podName:f0446bd1-52b0-4692-ad20-21395d1e30ad nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.628468004 +0000 UTC m=+135.832819825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls") pod "image-registry-695bcd9585-lcxln" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad") : secret "image-registry-tls" not found
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.128599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llhsj\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.129003 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.128984 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.129073 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.129052 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.129899 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.129879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.130925 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.130905 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.131020 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.130958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.138418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.138356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.138512 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.138422 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhsj\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.632611 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:56.632546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:56.632874 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.632705 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:56.632874 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.632727 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695bcd9585-lcxln: secret "image-registry-tls" not found
Apr 16 18:11:56.632874 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:56.632796 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls podName:f0446bd1-52b0-4692-ad20-21395d1e30ad nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.632779586 +0000 UTC m=+136.837131394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls") pod "image-registry-695bcd9585-lcxln" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad") : secret "image-registry-tls" not found
Apr 16 18:11:57.640878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:57.640837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:57.641288 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:57.640954 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:57.641288 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:57.640966 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695bcd9585-lcxln: secret "image-registry-tls" not found
Apr 16 18:11:57.641288 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:57.641025 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls podName:f0446bd1-52b0-4692-ad20-21395d1e30ad nodeName:}" failed. No retries permitted until 2026-04-16 18:11:59.641010475 +0000 UTC m=+138.845362279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls") pod "image-registry-695bcd9585-lcxln" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad") : secret "image-registry-tls" not found
Apr 16 18:11:59.653794 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:11:59.653756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:11:59.654293 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:59.653936 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:59.654293 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:59.653963 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695bcd9585-lcxln: secret "image-registry-tls" not found
Apr 16 18:11:59.654293 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:11:59.654033 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls podName:f0446bd1-52b0-4692-ad20-21395d1e30ad nodeName:}" failed. No retries permitted until 2026-04-16 18:12:03.654012702 +0000 UTC m=+142.858364511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls") pod "image-registry-695bcd9585-lcxln" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad") : secret "image-registry-tls" not found
Apr 16 18:12:03.062396 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:03.062364 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lc2jd_edb0e160-57f6-4631-88ff-4d22d6b51543/dns-node-resolver/0.log"
Apr 16 18:12:03.685458 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:03.685417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:12:03.685675 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:03.685593 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:12:03.685675 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:03.685610 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695bcd9585-lcxln: secret "image-registry-tls" not found
Apr 16 18:12:03.685675 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:03.685666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls podName:f0446bd1-52b0-4692-ad20-21395d1e30ad nodeName:}" failed. No retries permitted until 2026-04-16 18:12:11.685651454 +0000 UTC m=+150.890003258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls") pod "image-registry-695bcd9585-lcxln" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad") : secret "image-registry-tls" not found
Apr 16 18:12:03.867628 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:03.867601 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-87pfd_767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92/node-ca/0.log"
Apr 16 18:12:11.745489 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:11.745436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:12:11.747961 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:11.747934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"image-registry-695bcd9585-lcxln\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:12:11.793675 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:11.793639 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:12:11.912092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:11.912061 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"]
Apr 16 18:12:11.916063 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:11.916031 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0446bd1_52b0_4692_ad20_21395d1e30ad.slice/crio-0687461ea386682a38687130c9a34d4d2dadc364506a716f72120d8dc47774b6 WatchSource:0}: Error finding container 0687461ea386682a38687130c9a34d4d2dadc364506a716f72120d8dc47774b6: Status 404 returned error can't find the container with id 0687461ea386682a38687130c9a34d4d2dadc364506a716f72120d8dc47774b6
Apr 16 18:12:12.724196 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:12.724159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" event={"ID":"f0446bd1-52b0-4692-ad20-21395d1e30ad","Type":"ContainerStarted","Data":"5f120711df08442e9853772b1679b55d3ea4ec16a67c56a5fb8901df1330449b"}
Apr 16 18:12:12.724196 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:12.724203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" event={"ID":"f0446bd1-52b0-4692-ad20-21395d1e30ad","Type":"ContainerStarted","Data":"0687461ea386682a38687130c9a34d4d2dadc364506a716f72120d8dc47774b6"}
Apr 16 18:12:12.724431 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:12.724274 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-695bcd9585-lcxln"
Apr 16 18:12:17.258807 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:17.258748 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lp6zw" podUID="05e20dd4-40f5-487a-b14b-4e0d5b971aeb"
Apr 16 18:12:17.273904 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:17.273871 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vvblr" podUID="040d516b-7ed2-4aa4-b0b4-bc6131bac5cf"
Apr 16 18:12:17.317867 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:17.317832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bbpzd" podUID="d4d545a6-5b19-4165-9bd6-f5c19acf145a"
Apr 16 18:12:17.732782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:17.732748 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp6zw"
Apr 16 18:12:22.128435 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.128375 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw"
Apr 16 18:12:22.130670 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.130649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05e20dd4-40f5-487a-b14b-4e0d5b971aeb-metrics-tls\") pod \"dns-default-lp6zw\" (UID: \"05e20dd4-40f5-487a-b14b-4e0d5b971aeb\") " pod="openshift-dns/dns-default-lp6zw"
Apr 16 18:12:22.229172 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.229130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:12:22.231683 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.231650 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/040d516b-7ed2-4aa4-b0b4-bc6131bac5cf-cert\") pod \"ingress-canary-vvblr\" (UID: \"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf\") " pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:12:22.236222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.236201 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\"" Apr 16 18:12:22.244195 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.244174 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp6zw" Apr 16 18:12:22.362045 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.361998 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" podStartSLOduration=27.361982916 podStartE2EDuration="27.361982916s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:12.746209789 +0000 UTC m=+151.950561614" watchObservedRunningTime="2026-04-16 18:12:22.361982916 +0000 UTC m=+161.566334742" Apr 16 18:12:22.362875 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.362851 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp6zw"] Apr 16 18:12:22.366417 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:22.366385 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e20dd4_40f5_487a_b14b_4e0d5b971aeb.slice/crio-8bba716a24142522f4029aaeeaba6cac178212e55a5c898b25e03bcdbab22564 WatchSource:0}: Error finding container 8bba716a24142522f4029aaeeaba6cac178212e55a5c898b25e03bcdbab22564: Status 404 returned error can't find the container with id 8bba716a24142522f4029aaeeaba6cac178212e55a5c898b25e03bcdbab22564 Apr 16 18:12:22.743199 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:22.743168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp6zw" event={"ID":"05e20dd4-40f5-487a-b14b-4e0d5b971aeb","Type":"ContainerStarted","Data":"8bba716a24142522f4029aaeeaba6cac178212e55a5c898b25e03bcdbab22564"} Apr 16 18:12:24.749878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:24.749836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp6zw" event={"ID":"05e20dd4-40f5-487a-b14b-4e0d5b971aeb","Type":"ContainerStarted","Data":"78cba1b009b0b878365ffcbb3bec4b38109023b566ac662abf6a0dc8972017fc"} Apr 16 18:12:24.749878 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:24.749874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp6zw" event={"ID":"05e20dd4-40f5-487a-b14b-4e0d5b971aeb","Type":"ContainerStarted","Data":"72d108bc4ac6e43e802004ea77eb7a998e557e3cefee9a160a3c005407aef7d5"} Apr 16 18:12:24.750281 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:24.749984 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lp6zw" Apr 16 18:12:24.769339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:24.769303 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lp6zw" podStartSLOduration=129.445533783 podStartE2EDuration="2m10.769293933s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="2026-04-16 18:12:22.368742208 +0000 UTC m=+161.573094018" 
lastFinishedPulling="2026-04-16 18:12:23.692502361 +0000 UTC m=+162.896854168" observedRunningTime="2026-04-16 18:12:24.76835459 +0000 UTC m=+163.972706417" watchObservedRunningTime="2026-04-16 18:12:24.769293933 +0000 UTC m=+163.973645761" Apr 16 18:12:26.448201 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.448164 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qn9c9"] Apr 16 18:12:26.451378 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.451355 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.453808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.453787 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:12:26.454663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.454645 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:12:26.454949 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.454928 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:12:26.455037 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.454947 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:12:26.455037 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.454929 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9j6xp\"" Apr 16 18:12:26.461858 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.461833 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qn9c9"] Apr 16 18:12:26.510682 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:12:26.510647 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"] Apr 16 18:12:26.514746 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.514705 2567 patch_prober.go:28] interesting pod/image-registry-695bcd9585-lcxln container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:26.514895 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.514766 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:26.554492 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.554456 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-pkr4q"] Apr 16 18:12:26.557359 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557336 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:26.557534 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557510 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/59c9a688-bbaa-4127-917e-ea403858f0a6-crio-socket\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.557619 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557575 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djx7z\" (UniqueName: \"kubernetes.io/projected/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-api-access-djx7z\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.557619 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/59c9a688-bbaa-4127-917e-ea403858f0a6-data-volume\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.557723 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/59c9a688-bbaa-4127-917e-ea403858f0a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.557776 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.557731 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.561324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.561309 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-fw7cn\"" Apr 16 18:12:26.561495 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.561476 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:12:26.561574 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.561475 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:12:26.582126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.582095 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-pkr4q"] Apr 16 18:12:26.658423 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.658383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/59c9a688-bbaa-4127-917e-ea403858f0a6-crio-socket\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659235 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.658430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djx7z\" (UniqueName: \"kubernetes.io/projected/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-api-access-djx7z\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " 
pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659235 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.658896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/59c9a688-bbaa-4127-917e-ea403858f0a6-crio-socket\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659235 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.658899 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/59c9a688-bbaa-4127-917e-ea403858f0a6-data-volume\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659235 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.658983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/59c9a688-bbaa-4127-917e-ea403858f0a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659235 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.659236 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/59c9a688-bbaa-4127-917e-ea403858f0a6-data-volume\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.659594 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.659349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb98s\" (UniqueName: 
\"kubernetes.io/projected/2e637d56-5f0b-48b7-8dac-7c0dca151d31-kube-api-access-pb98s\") pod \"downloads-586b57c7b4-pkr4q\" (UID: \"2e637d56-5f0b-48b7-8dac-7c0dca151d31\") " pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:26.659594 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.659406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.662133 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.659976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.662645 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.662407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/59c9a688-bbaa-4127-917e-ea403858f0a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.671118 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.671095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djx7z\" (UniqueName: \"kubernetes.io/projected/59c9a688-bbaa-4127-917e-ea403858f0a6-kube-api-access-djx7z\") pod \"insights-runtime-extractor-qn9c9\" (UID: \"59c9a688-bbaa-4127-917e-ea403858f0a6\") " pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.760193 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:12:26.760114 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qn9c9" Apr 16 18:12:26.760322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.760249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb98s\" (UniqueName: \"kubernetes.io/projected/2e637d56-5f0b-48b7-8dac-7c0dca151d31-kube-api-access-pb98s\") pod \"downloads-586b57c7b4-pkr4q\" (UID: \"2e637d56-5f0b-48b7-8dac-7c0dca151d31\") " pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:26.772050 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.772029 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb98s\" (UniqueName: \"kubernetes.io/projected/2e637d56-5f0b-48b7-8dac-7c0dca151d31-kube-api-access-pb98s\") pod \"downloads-586b57c7b4-pkr4q\" (UID: \"2e637d56-5f0b-48b7-8dac-7c0dca151d31\") " pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:26.866330 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.866297 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:26.889473 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.889443 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qn9c9"] Apr 16 18:12:26.892340 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:26.892301 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c9a688_bbaa_4127_917e_ea403858f0a6.slice/crio-2af8f0bb6e1ae724586b51f7c042f23c78d65501142d71198b17bc188a5ca0bf WatchSource:0}: Error finding container 2af8f0bb6e1ae724586b51f7c042f23c78d65501142d71198b17bc188a5ca0bf: Status 404 returned error can't find the container with id 2af8f0bb6e1ae724586b51f7c042f23c78d65501142d71198b17bc188a5ca0bf Apr 16 18:12:26.986086 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:26.986058 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-pkr4q"] Apr 16 18:12:26.989692 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:26.989666 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e637d56_5f0b_48b7_8dac_7c0dca151d31.slice/crio-5188f77d001555c4f02f2e105cf863edf33e9ef7980f832a2e2b2adaa2a1b003 WatchSource:0}: Error finding container 5188f77d001555c4f02f2e105cf863edf33e9ef7980f832a2e2b2adaa2a1b003: Status 404 returned error can't find the container with id 5188f77d001555c4f02f2e105cf863edf33e9ef7980f832a2e2b2adaa2a1b003 Apr 16 18:12:27.759278 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:27.759240 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-pkr4q" event={"ID":"2e637d56-5f0b-48b7-8dac-7c0dca151d31","Type":"ContainerStarted","Data":"5188f77d001555c4f02f2e105cf863edf33e9ef7980f832a2e2b2adaa2a1b003"} Apr 16 18:12:27.761178 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:27.761138 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn9c9" event={"ID":"59c9a688-bbaa-4127-917e-ea403858f0a6","Type":"ContainerStarted","Data":"24085c39f1ab124bf3ca20320e733fc65830beb4b187d3f9d19acc5ebbbec2eb"} Apr 16 18:12:27.761178 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:27.761176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn9c9" event={"ID":"59c9a688-bbaa-4127-917e-ea403858f0a6","Type":"ContainerStarted","Data":"288c3bea3ed05ba657f533f4fa0c64cca3f16a3c29d33d70f9383b3d6f101bd4"} Apr 16 18:12:27.761340 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:27.761190 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn9c9" event={"ID":"59c9a688-bbaa-4127-917e-ea403858f0a6","Type":"ContainerStarted","Data":"2af8f0bb6e1ae724586b51f7c042f23c78d65501142d71198b17bc188a5ca0bf"} Apr 16 18:12:28.306592 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.306539 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:12:28.306790 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.306729 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:12:28.309730 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.309712 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\"" Apr 16 18:12:28.317130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.317104 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vvblr" Apr 16 18:12:28.467170 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.467137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vvblr"] Apr 16 18:12:28.470043 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:28.470011 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040d516b_7ed2_4aa4_b0b4_bc6131bac5cf.slice/crio-f6e4503dc04968a9726142f818c77417d59745df8f6446a1a55fc5445c5731b6 WatchSource:0}: Error finding container f6e4503dc04968a9726142f818c77417d59745df8f6446a1a55fc5445c5731b6: Status 404 returned error can't find the container with id f6e4503dc04968a9726142f818c77417d59745df8f6446a1a55fc5445c5731b6 Apr 16 18:12:28.765013 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:28.764975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vvblr" event={"ID":"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf","Type":"ContainerStarted","Data":"f6e4503dc04968a9726142f818c77417d59745df8f6446a1a55fc5445c5731b6"} Apr 16 18:12:29.770457 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:29.770400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn9c9" event={"ID":"59c9a688-bbaa-4127-917e-ea403858f0a6","Type":"ContainerStarted","Data":"a40d8d9d54e7429ba6da82409bd996981e8a8123f099997ac90e2ff6018b187f"} Apr 16 18:12:29.794619 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:29.794536 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qn9c9" podStartSLOduration=1.613923878 podStartE2EDuration="3.794515652s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="2026-04-16 18:12:26.956358372 +0000 UTC m=+166.160710177" lastFinishedPulling="2026-04-16 18:12:29.136950132 +0000 UTC 
m=+168.341301951" observedRunningTime="2026-04-16 18:12:29.793407871 +0000 UTC m=+168.997759699" watchObservedRunningTime="2026-04-16 18:12:29.794515652 +0000 UTC m=+168.998867479" Apr 16 18:12:30.136946 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.136913 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k"] Apr 16 18:12:30.141266 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.141238 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:30.143882 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.143860 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:12:30.144010 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.143890 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-j8mls\"" Apr 16 18:12:30.148974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.148935 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k"] Apr 16 18:12:30.287416 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.287331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5541e9a7-264e-46c1-96dd-157bccc85ffe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p9d8k\" (UID: \"5541e9a7-264e-46c1-96dd-157bccc85ffe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:30.388941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.388832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5541e9a7-264e-46c1-96dd-157bccc85ffe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p9d8k\" (UID: \"5541e9a7-264e-46c1-96dd-157bccc85ffe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:30.391749 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.391718 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5541e9a7-264e-46c1-96dd-157bccc85ffe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p9d8k\" (UID: \"5541e9a7-264e-46c1-96dd-157bccc85ffe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:30.453448 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.453315 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:30.584822 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.584756 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k"] Apr 16 18:12:30.589657 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:30.589623 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5541e9a7_264e_46c1_96dd_157bccc85ffe.slice/crio-750bf8e84faa16ac8111c5927c59df27d78eb345c4459e607932b02b111d986f WatchSource:0}: Error finding container 750bf8e84faa16ac8111c5927c59df27d78eb345c4459e607932b02b111d986f: Status 404 returned error can't find the container with id 750bf8e84faa16ac8111c5927c59df27d78eb345c4459e607932b02b111d986f Apr 16 18:12:30.774186 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.774092 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vvblr" 
event={"ID":"040d516b-7ed2-4aa4-b0b4-bc6131bac5cf","Type":"ContainerStarted","Data":"5e07b8277a955836a479e4ff46cbee6e8bbd393c6a585fc04181967fa1a4261c"} Apr 16 18:12:30.775474 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.775443 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" event={"ID":"5541e9a7-264e-46c1-96dd-157bccc85ffe","Type":"ContainerStarted","Data":"750bf8e84faa16ac8111c5927c59df27d78eb345c4459e607932b02b111d986f"} Apr 16 18:12:30.797203 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:30.797129 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vvblr" podStartSLOduration=134.832093536 podStartE2EDuration="2m16.797111793s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="2026-04-16 18:12:28.472215494 +0000 UTC m=+167.676567298" lastFinishedPulling="2026-04-16 18:12:30.437233749 +0000 UTC m=+169.641585555" observedRunningTime="2026-04-16 18:12:30.796838472 +0000 UTC m=+170.001190299" watchObservedRunningTime="2026-04-16 18:12:30.797111793 +0000 UTC m=+170.001463621" Apr 16 18:12:32.270099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:32.269998 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" podUID="20183c7a-22bd-4fbc-b9ae-34b39f843753" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:32.783089 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:32.783048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" event={"ID":"5541e9a7-264e-46c1-96dd-157bccc85ffe","Type":"ContainerStarted","Data":"7bfa5cf06189c820660ad063fd6d19b88d4ffdeaa39addeeea96ce35e35fe6cc"} Apr 16 18:12:32.783395 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:32.783369 
2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:32.789991 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:32.789963 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" Apr 16 18:12:32.799948 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:32.799910 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p9d8k" podStartSLOduration=1.46362714 podStartE2EDuration="2.79989633s" podCreationTimestamp="2026-04-16 18:12:30 +0000 UTC" firstStartedPulling="2026-04-16 18:12:30.59194154 +0000 UTC m=+169.796293348" lastFinishedPulling="2026-04-16 18:12:31.92821072 +0000 UTC m=+171.132562538" observedRunningTime="2026-04-16 18:12:32.799469132 +0000 UTC m=+172.003820959" watchObservedRunningTime="2026-04-16 18:12:32.79989633 +0000 UTC m=+172.004248147" Apr 16 18:12:34.755740 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:34.755709 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lp6zw" Apr 16 18:12:36.516052 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:36.516017 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" Apr 16 18:12:42.270649 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.270605 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" podUID="20183c7a-22bd-4fbc-b9ae-34b39f843753" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:42.791365 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.791332 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-7479c89684-fcl87"] Apr 16 18:12:42.795892 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.795868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.798369 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.798340 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-t9z4k\"" Apr 16 18:12:42.799094 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799070 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:12:42.799261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799079 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:12:42.799261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799205 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:12:42.799261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799242 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:12:42.799514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799299 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:12:42.799514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.799347 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:12:42.804332 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.804285 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/kube-state-metrics-7479c89684-fcl87"] Apr 16 18:12:42.885028 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.884992 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.885243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.885039 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnkx\" (UniqueName: \"kubernetes.io/projected/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-api-access-cbnkx\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.885243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.885073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fd9f939d-a19f-43a9-aba5-792530142e5a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.885243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.885103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.885243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.885171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.885243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.885219 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.942648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.942610 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-znszh"] Apr 16 18:12:42.948244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.948211 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.950522 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.950499 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:12:42.950522 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.950516 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:12:42.950747 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.950501 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qbj8r\"" Apr 16 18:12:42.950747 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.950599 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:12:42.986631 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986592 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-wtmp\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.986829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-metrics-client-ca\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.986829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-textfile\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.986829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n99t\" (UniqueName: \"kubernetes.io/projected/f9a13485-2c15-41ed-bd77-dcd33a714804-kube-api-access-4n99t\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.986829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-root\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-sys\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986941 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-accelerators-collector-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.986985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnkx\" (UniqueName: \"kubernetes.io/projected/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-api-access-cbnkx\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fd9f939d-a19f-43a9-aba5-792530142e5a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987082 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" 
(UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987179 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:42.987592 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fd9f939d-a19f-43a9-aba5-792530142e5a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: 
\"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.987935 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987888 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.988055 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.987965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd9f939d-a19f-43a9-aba5-792530142e5a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.989869 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.989835 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.990280 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:42.990253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:42.998928 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:12:42.998896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnkx\" (UniqueName: \"kubernetes.io/projected/fd9f939d-a19f-43a9-aba5-792530142e5a-kube-api-access-cbnkx\") pod \"kube-state-metrics-7479c89684-fcl87\" (UID: \"fd9f939d-a19f-43a9-aba5-792530142e5a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:43.088047 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.087997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-textfile\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088047 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n99t\" (UniqueName: \"kubernetes.io/projected/f9a13485-2c15-41ed-bd77-dcd33a714804-kube-api-access-4n99t\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-root\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " 
pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-sys\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-accelerators-collector-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-sys\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-textfile\") pod 
\"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-root\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:43.088318 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:12:43.088484 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-wtmp\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088791 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:43.088494 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls podName:f9a13485-2c15-41ed-bd77-dcd33a714804 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:43.588466859 +0000 UTC m=+182.792818675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls") pod "node-exporter-znszh" (UID: "f9a13485-2c15-41ed-bd77-dcd33a714804") : secret "node-exporter-tls" not found Apr 16 18:12:43.088791 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-metrics-client-ca\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.088791 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.088598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-wtmp\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.089061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.089029 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-metrics-client-ca\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.089130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.089053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-accelerators-collector-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.090952 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:12:43.090931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.100444 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.100414 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n99t\" (UniqueName: \"kubernetes.io/projected/f9a13485-2c15-41ed-bd77-dcd33a714804-kube-api-access-4n99t\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.106899 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.106870 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" Apr 16 18:12:43.298934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.298907 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-fcl87"] Apr 16 18:12:43.310672 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:43.310635 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd9f939d_a19f_43a9_aba5_792530142e5a.slice/crio-2275953c9fd7baedaa322a5a6576e5f01c8586cfab443975899ce0f2174d59e7 WatchSource:0}: Error finding container 2275953c9fd7baedaa322a5a6576e5f01c8586cfab443975899ce0f2174d59e7: Status 404 returned error can't find the container with id 2275953c9fd7baedaa322a5a6576e5f01c8586cfab443975899ce0f2174d59e7 Apr 16 18:12:43.592814 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.592774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.596297 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.596269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9a13485-2c15-41ed-bd77-dcd33a714804-node-exporter-tls\") pod \"node-exporter-znszh\" (UID: \"f9a13485-2c15-41ed-bd77-dcd33a714804\") " pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.814989 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.814948 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-pkr4q" event={"ID":"2e637d56-5f0b-48b7-8dac-7c0dca151d31","Type":"ContainerStarted","Data":"39b462653b841484aea7b31f4713ee9985d91c88d83ec71b34d11d7f83deeade"} Apr 16 18:12:43.815418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.815282 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:43.817262 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.816809 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" event={"ID":"fd9f939d-a19f-43a9-aba5-792530142e5a","Type":"ContainerStarted","Data":"2275953c9fd7baedaa322a5a6576e5f01c8586cfab443975899ce0f2174d59e7"} Apr 16 18:12:43.834362 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.834330 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-pkr4q" Apr 16 18:12:43.843407 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.841900 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-pkr4q" podStartSLOduration=1.55479234 podStartE2EDuration="17.841879046s" 
podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="2026-04-16 18:12:26.991311609 +0000 UTC m=+166.195663413" lastFinishedPulling="2026-04-16 18:12:43.27839831 +0000 UTC m=+182.482750119" observedRunningTime="2026-04-16 18:12:43.84061458 +0000 UTC m=+183.044966427" watchObservedRunningTime="2026-04-16 18:12:43.841879046 +0000 UTC m=+183.046230873" Apr 16 18:12:43.859029 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.858991 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-znszh" Apr 16 18:12:43.871140 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:43.871102 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a13485_2c15_41ed_bd77_dcd33a714804.slice/crio-d3016bf2e427b3f48575be2592c02d60f07c8f102e73e183486166a909db3f2e WatchSource:0}: Error finding container d3016bf2e427b3f48575be2592c02d60f07c8f102e73e183486166a909db3f2e: Status 404 returned error can't find the container with id d3016bf2e427b3f48575be2592c02d60f07c8f102e73e183486166a909db3f2e Apr 16 18:12:43.989577 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.989520 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:43.995021 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.994971 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:43.999868 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.999779 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:12:44.000028 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.999870 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:12:44.000028 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:43.999901 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:12:44.000736 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.000329 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:12:44.000736 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.000339 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:12:44.000736 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.000483 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-645vg\"" Apr 16 18:12:44.000736 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.000610 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:12:44.000736 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.000680 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:12:44.001439 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.001275 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:12:44.001439 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.001317 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:12:44.011195 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.011163 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:44.096987 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.096803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-volume\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.096987 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.096854 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-tls-assets\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.096987 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.096875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097286 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097034 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097286 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-out\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097286 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097474 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-web-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097548 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097548 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:12:44.097530 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmwc\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-kube-api-access-cfmwc\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.097704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.097638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-web-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198234 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198296 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmwc\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-kube-api-access-cfmwc\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198333 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-volume\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-tls-assets\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:12:44.198470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.198587 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.198513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.199312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-out\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.199356 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:44.199534 2567 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:44.199626 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle podName:188bb96a-e2ed-4695-ab48-9de2bfac7427 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:44.699599977 +0000 UTC m=+183.903951805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "188bb96a-e2ed-4695-ab48-9de2bfac7427") : configmap references non-existent config key: ca-bundle.crt Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:12:44.199654 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls podName:188bb96a-e2ed-4695-ab48-9de2bfac7427 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:44.699642958 +0000 UTC m=+183.903994762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "188bb96a-e2ed-4695-ab48-9de2bfac7427") : secret "alertmanager-main-tls" not found Apr 16 18:12:44.200443 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.199930 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.201482 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.200657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.204254 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.204159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.204442 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.204386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-web-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.204612 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.204592 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-out\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.204966 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.204938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.205376 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.205353 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-tls-assets\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.208823 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.208777 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-config-volume\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.209411 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.209369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.209967 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:12:44.209928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.213137 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.213091 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfmwc\" (UniqueName: \"kubernetes.io/projected/188bb96a-e2ed-4695-ab48-9de2bfac7427-kube-api-access-cfmwc\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.703750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.703703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.704827 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.703924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.704993 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.704915 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/188bb96a-e2ed-4695-ab48-9de2bfac7427-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.707084 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.706960 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/188bb96a-e2ed-4695-ab48-9de2bfac7427-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"188bb96a-e2ed-4695-ab48-9de2bfac7427\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:44.823274 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.823236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-znszh" event={"ID":"f9a13485-2c15-41ed-bd77-dcd33a714804","Type":"ContainerStarted","Data":"d3016bf2e427b3f48575be2592c02d60f07c8f102e73e183486166a909db3f2e"} Apr 16 18:12:44.918603 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:44.918548 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:45.112653 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.112582 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:45.115423 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:45.115387 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188bb96a_e2ed_4695_ab48_9de2bfac7427.slice/crio-34b53983cb8c564857a2e04bdea1693a1d294fe6afad2187b37a923ca5b15adc WatchSource:0}: Error finding container 34b53983cb8c564857a2e04bdea1693a1d294fe6afad2187b37a923ca5b15adc: Status 404 returned error can't find the container with id 34b53983cb8c564857a2e04bdea1693a1d294fe6afad2187b37a923ca5b15adc Apr 16 18:12:45.827520 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.827479 2567 generic.go:358] "Generic (PLEG): container finished" podID="f9a13485-2c15-41ed-bd77-dcd33a714804" 
containerID="b5e37ca8f72ef0fc5bb3d5699d12cfcbc71f1985dc93f5edb6cd2e321cd6e7f2" exitCode=0 Apr 16 18:12:45.828007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.827592 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-znszh" event={"ID":"f9a13485-2c15-41ed-bd77-dcd33a714804","Type":"ContainerDied","Data":"b5e37ca8f72ef0fc5bb3d5699d12cfcbc71f1985dc93f5edb6cd2e321cd6e7f2"} Apr 16 18:12:45.829279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.829253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"34b53983cb8c564857a2e04bdea1693a1d294fe6afad2187b37a923ca5b15adc"} Apr 16 18:12:45.831699 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.831668 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" event={"ID":"fd9f939d-a19f-43a9-aba5-792530142e5a","Type":"ContainerStarted","Data":"3ab57e255f73b1c45a7f7ab5b7be7ace2a9a372cb3265492226bdca047d1a94f"} Apr 16 18:12:45.831821 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.831707 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" event={"ID":"fd9f939d-a19f-43a9-aba5-792530142e5a","Type":"ContainerStarted","Data":"a50a5c60be96545f4b26c742c30dd0d9d2cc27a52261b79b74273c33fe019806"} Apr 16 18:12:45.831821 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.831721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" event={"ID":"fd9f939d-a19f-43a9-aba5-792530142e5a","Type":"ContainerStarted","Data":"8802f7bdbe19cdee0914dbfc9dd91ae731710bbf314494cc6e6900869f1d45d2"} Apr 16 18:12:45.865053 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:45.864997 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/kube-state-metrics-7479c89684-fcl87" podStartSLOduration=2.376915141 podStartE2EDuration="3.864977201s" podCreationTimestamp="2026-04-16 18:12:42 +0000 UTC" firstStartedPulling="2026-04-16 18:12:43.31238485 +0000 UTC m=+182.516736654" lastFinishedPulling="2026-04-16 18:12:44.800446904 +0000 UTC m=+184.004798714" observedRunningTime="2026-04-16 18:12:45.864456727 +0000 UTC m=+185.068808571" watchObservedRunningTime="2026-04-16 18:12:45.864977201 +0000 UTC m=+185.069329028" Apr 16 18:12:46.836501 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:46.836470 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-znszh" event={"ID":"f9a13485-2c15-41ed-bd77-dcd33a714804","Type":"ContainerStarted","Data":"c7552450d21fcf99a02cdd496eaee1e881c0b92833de976bcb62fff0a5bb498c"} Apr 16 18:12:46.836501 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:46.836506 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-znszh" event={"ID":"f9a13485-2c15-41ed-bd77-dcd33a714804","Type":"ContainerStarted","Data":"0d3a6391f134590185ae5b78239f23f431bb62f62ae710fcd8420d0e8c22593b"} Apr 16 18:12:46.856466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:46.856422 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-znszh" podStartSLOduration=3.9279487250000003 podStartE2EDuration="4.856375587s" podCreationTimestamp="2026-04-16 18:12:42 +0000 UTC" firstStartedPulling="2026-04-16 18:12:43.873683968 +0000 UTC m=+183.078035773" lastFinishedPulling="2026-04-16 18:12:44.802110823 +0000 UTC m=+184.006462635" observedRunningTime="2026-04-16 18:12:46.855634597 +0000 UTC m=+186.059986424" watchObservedRunningTime="2026-04-16 18:12:46.856375587 +0000 UTC m=+186.060727414" Apr 16 18:12:47.256466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.256428 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/metrics-server-66c678675d-p6mc8"] Apr 16 18:12:47.273883 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.273855 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66c678675d-p6mc8"] Apr 16 18:12:47.274055 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.273993 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.276576 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.276540 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:12:47.276693 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.276593 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:12:47.276693 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.276640 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:12:47.277426 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.277403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:12:47.277519 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.277431 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-tc4cq\"" Apr 16 18:12:47.277924 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.277883 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fnp7r5ovbgujo\"" Apr 16 18:12:47.329607 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329553 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-client-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-tls\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-client-certs\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/08380e44-f4bb-4d91-b88f-becbbd96a33d-audit-log\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329910 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwd7\" (UniqueName: \"kubernetes.io/projected/08380e44-f4bb-4d91-b88f-becbbd96a33d-kube-api-access-xqwd7\") pod \"metrics-server-66c678675d-p6mc8\" (UID: 
\"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329910 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-metrics-server-audit-profiles\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.329910 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.329821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431135 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-tls\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-client-certs\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431322 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/08380e44-f4bb-4d91-b88f-becbbd96a33d-audit-log\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwd7\" (UniqueName: \"kubernetes.io/projected/08380e44-f4bb-4d91-b88f-becbbd96a33d-kube-api-access-xqwd7\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-metrics-server-audit-profiles\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431603 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431345 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-client-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.431764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.431737 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/08380e44-f4bb-4d91-b88f-becbbd96a33d-audit-log\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.432118 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.432093 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.432771 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.432741 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/08380e44-f4bb-4d91-b88f-becbbd96a33d-metrics-server-audit-profiles\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.434232 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.434212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-client-certs\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 
16 18:12:47.434372 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.434352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-client-ca-bundle\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.440331 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.440310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwd7\" (UniqueName: \"kubernetes.io/projected/08380e44-f4bb-4d91-b88f-becbbd96a33d-kube-api-access-xqwd7\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.442107 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.442085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/08380e44-f4bb-4d91-b88f-becbbd96a33d-secret-metrics-server-tls\") pod \"metrics-server-66c678675d-p6mc8\" (UID: \"08380e44-f4bb-4d91-b88f-becbbd96a33d\") " pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.585159 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.585118 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:12:47.723084 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.723010 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66c678675d-p6mc8"] Apr 16 18:12:47.725945 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:47.725918 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08380e44_f4bb_4d91_b88f_becbbd96a33d.slice/crio-8423d2ab6d34e59c23bbe5cb2beabe766567c6b2d7ea4ce3bf6c1e00d8567f0b WatchSource:0}: Error finding container 8423d2ab6d34e59c23bbe5cb2beabe766567c6b2d7ea4ce3bf6c1e00d8567f0b: Status 404 returned error can't find the container with id 8423d2ab6d34e59c23bbe5cb2beabe766567c6b2d7ea4ce3bf6c1e00d8567f0b Apr 16 18:12:47.841294 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.841207 2567 generic.go:358] "Generic (PLEG): container finished" podID="188bb96a-e2ed-4695-ab48-9de2bfac7427" containerID="b5840cfc73a613b6fc7265dc02ce08e510d820023538b6519a8bc4d8d23f3c2d" exitCode=0 Apr 16 18:12:47.841766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.841292 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerDied","Data":"b5840cfc73a613b6fc7265dc02ce08e510d820023538b6519a8bc4d8d23f3c2d"} Apr 16 18:12:47.842502 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:47.842471 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" event={"ID":"08380e44-f4bb-4d91-b88f-becbbd96a33d","Type":"ContainerStarted","Data":"8423d2ab6d34e59c23bbe5cb2beabe766567c6b2d7ea4ce3bf6c1e00d8567f0b"} Apr 16 18:12:49.127637 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.127598 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:49.151257 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.151215 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.151928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.151291 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:49.154655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.154483 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:12:49.154655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.154539 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:12:49.154976 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.154758 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:12:49.155190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155011 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:12:49.155190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155065 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:12:49.155795 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jmhtn\"" Apr 16 18:12:49.155795 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155725 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:12:49.155950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155834 2567 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:12:49.155950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155837 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:12:49.155950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.155941 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:12:49.156155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.156133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:12:49.156155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.156148 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5hnqrdrkqr03k\"" Apr 16 18:12:49.156488 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.156470 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:12:49.158471 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.158419 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:12:49.249847 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.249808 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.249871 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.249907 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-web-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.249957 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-config-out\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.249984 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250012 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250144 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250342 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250329 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250723 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250365 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 18:12:49.250723 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250431 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsc88\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-kube-api-access-lsc88\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250723 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.250723 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.250594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351469 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351434 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351469 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351471 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351491 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-web-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-config-out\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.351752 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351763 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsc88\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-kube-api-access-lsc88\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.351964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.352873 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.352741 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.353085 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.353059 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.353454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.353346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.354202 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.354123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.355113 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.354588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881d1e93-5753-43ad-9c2a-d449bf97eb14-config-out\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.355113 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.354899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.356033 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.355440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-web-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.356155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.356125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.356319 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.356292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/881d1e93-5753-43ad-9c2a-d449bf97eb14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.357418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.357389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.357663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.357618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.357962 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.357919 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.358465 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.358426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.358777 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.358750 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-config\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.358998 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.358977 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.359451 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.359429 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/881d1e93-5753-43ad-9c2a-d449bf97eb14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.359576 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.359534 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.367180 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.367158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lsc88\" (UniqueName: \"kubernetes.io/projected/881d1e93-5753-43ad-9c2a-d449bf97eb14-kube-api-access-lsc88\") pod \"prometheus-k8s-0\" (UID: \"881d1e93-5753-43ad-9c2a-d449bf97eb14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:49.466834 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:49.466733 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:50.406088 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.406009 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:50.407012 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:12:50.406978 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881d1e93_5753_43ad_9c2a_d449bf97eb14.slice/crio-6d877dcc6ae87fbcbf8e016733073f1d29ff96910988ca10b05dfc0143598392 WatchSource:0}: Error finding container 6d877dcc6ae87fbcbf8e016733073f1d29ff96910988ca10b05dfc0143598392: Status 404 returned error can't find the container with id 6d877dcc6ae87fbcbf8e016733073f1d29ff96910988ca10b05dfc0143598392 Apr 16 18:12:50.854716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.854680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"1f5175ba1408340d546878d77a73802c437bfca623f22a7384e1bda1a08ba8fa"} Apr 16 18:12:50.854716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.854721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"bd1cf9d5eba630841cd26da0e2470dd884c7446cc32d191e61655f9fc16e6f1c"} Apr 16 18:12:50.854928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.854737 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"bd554fde0c88c5f59a0deb11f8c6ac200bdf0677008f688c81270583e0789b6f"} Apr 16 18:12:50.854928 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.854750 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"b8f18a430512313d5a33fe948d0091bbefc3cbbdd6e68f14c8f79c01de5317bc"} Apr 16 18:12:50.856270 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.856242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" event={"ID":"08380e44-f4bb-4d91-b88f-becbbd96a33d","Type":"ContainerStarted","Data":"865c1b3f482376b1837adfb5d1185d462b8f9fbca5dfc0270af1a2d9d66400e3"} Apr 16 18:12:50.857808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.857734 2567 generic.go:358] "Generic (PLEG): container finished" podID="881d1e93-5753-43ad-9c2a-d449bf97eb14" containerID="0ab67234776e939d9f65f787b42c5244d62d157dc5fce5d0ea81f0aff652fd7a" exitCode=0 Apr 16 18:12:50.857808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.857768 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerDied","Data":"0ab67234776e939d9f65f787b42c5244d62d157dc5fce5d0ea81f0aff652fd7a"} Apr 16 18:12:50.857808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.857788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"6d877dcc6ae87fbcbf8e016733073f1d29ff96910988ca10b05dfc0143598392"} Apr 16 18:12:50.877341 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:50.877297 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" podStartSLOduration=1.3498535440000001 podStartE2EDuration="3.877284322s" podCreationTimestamp="2026-04-16 18:12:47 +0000 UTC" firstStartedPulling="2026-04-16 18:12:47.728056817 +0000 UTC m=+186.932408620" lastFinishedPulling="2026-04-16 18:12:50.255487578 +0000 UTC m=+189.459839398" observedRunningTime="2026-04-16 18:12:50.876203101 +0000 UTC m=+190.080554965" watchObservedRunningTime="2026-04-16 18:12:50.877284322 +0000 UTC m=+190.081636148" Apr 16 18:12:51.530192 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:51.529822 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerName="registry" containerID="cri-o://5f120711df08442e9853772b1679b55d3ea4ec16a67c56a5fb8901df1330449b" gracePeriod=30 Apr 16 18:12:51.864755 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:51.864710 2567 generic.go:358] "Generic (PLEG): container finished" podID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerID="5f120711df08442e9853772b1679b55d3ea4ec16a67c56a5fb8901df1330449b" exitCode=0 Apr 16 18:12:51.865025 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:51.864808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" event={"ID":"f0446bd1-52b0-4692-ad20-21395d1e30ad","Type":"ContainerDied","Data":"5f120711df08442e9853772b1679b55d3ea4ec16a67c56a5fb8901df1330449b"} Apr 16 18:12:51.868821 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:51.868792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"27620537e58a8e0488b8f7127d59314494afb003401813bd12d0cd9e5a38e9a0"} Apr 16 18:12:52.047486 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.047447 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" Apr 16 18:12:52.179621 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179497 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.179621 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179586 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.179959 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179629 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.179959 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179660 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llhsj\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.179959 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179689 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: 
\"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.180120 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.179968 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.180055 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.180325 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:52.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.180434 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") pod \"f0446bd1-52b0-4692-ad20-21395d1e30ad\" (UID: \"f0446bd1-52b0-4692-ad20-21395d1e30ad\") " Apr 16 18:12:52.181034 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.180752 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-trusted-ca\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.181289 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.181050 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:52.183586 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.183320 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:52.183586 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.183392 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:52.183586 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.183459 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:52.184790 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.184761 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:52.185202 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.185176 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj" (OuterVolumeSpecName: "kube-api-access-llhsj") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "kube-api-access-llhsj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:52.193006 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.192980 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f0446bd1-52b0-4692-ad20-21395d1e30ad" (UID: "f0446bd1-52b0-4692-ad20-21395d1e30ad"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:52.270483 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.270437 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" podUID="20183c7a-22bd-4fbc-b9ae-34b39f843753" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:52.270747 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.270539 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" Apr 16 18:12:52.271322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.271206 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"017385975edaa6ebb4fc0dd42012647aaefaf7b31f1db94981639980ff867649"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:12:52.271322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.271276 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" podUID="20183c7a-22bd-4fbc-b9ae-34b39f843753" containerName="service-proxy" containerID="cri-o://017385975edaa6ebb4fc0dd42012647aaefaf7b31f1db94981639980ff867649" 
gracePeriod=30 Apr 16 18:12:52.281707 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281672 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-image-registry-private-configuration\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.281707 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281700 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llhsj\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-kube-api-access-llhsj\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.281707 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281710 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0446bd1-52b0-4692-ad20-21395d1e30ad-ca-trust-extracted\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.281995 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281722 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-certificates\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.281995 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281736 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-registry-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.281995 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281750 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0446bd1-52b0-4692-ad20-21395d1e30ad-installation-pull-secrets\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 
18:12:52.281995 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.281764 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0446bd1-52b0-4692-ad20-21395d1e30ad-bound-sa-token\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:12:52.875411 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.875374 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"188bb96a-e2ed-4695-ab48-9de2bfac7427","Type":"ContainerStarted","Data":"39146992326dd7501ea2d9f928f4c07ede9ea6b0facb0216703d709833f3ae4d"} Apr 16 18:12:52.877673 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.877644 2567 generic.go:358] "Generic (PLEG): container finished" podID="20183c7a-22bd-4fbc-b9ae-34b39f843753" containerID="017385975edaa6ebb4fc0dd42012647aaefaf7b31f1db94981639980ff867649" exitCode=2 Apr 16 18:12:52.877790 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.877717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerDied","Data":"017385975edaa6ebb4fc0dd42012647aaefaf7b31f1db94981639980ff867649"} Apr 16 18:12:52.877790 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.877754 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f8db7b48-qb5kl" event={"ID":"20183c7a-22bd-4fbc-b9ae-34b39f843753","Type":"ContainerStarted","Data":"21e9cb79bd7d543c4c137335e577dcf6fcb4f528d09ce5d8caadccc8eb4c03c4"} Apr 16 18:12:52.878972 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.878947 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" 
event={"ID":"f0446bd1-52b0-4692-ad20-21395d1e30ad","Type":"ContainerDied","Data":"0687461ea386682a38687130c9a34d4d2dadc364506a716f72120d8dc47774b6"} Apr 16 18:12:52.879089 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.878975 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-695bcd9585-lcxln" Apr 16 18:12:52.879089 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.878983 2567 scope.go:117] "RemoveContainer" containerID="5f120711df08442e9853772b1679b55d3ea4ec16a67c56a5fb8901df1330449b" Apr 16 18:12:52.905202 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.905132 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.670655637 podStartE2EDuration="9.905112079s" podCreationTimestamp="2026-04-16 18:12:43 +0000 UTC" firstStartedPulling="2026-04-16 18:12:45.117912994 +0000 UTC m=+184.322264801" lastFinishedPulling="2026-04-16 18:12:52.352369431 +0000 UTC m=+191.556721243" observedRunningTime="2026-04-16 18:12:52.903552162 +0000 UTC m=+192.107904031" watchObservedRunningTime="2026-04-16 18:12:52.905112079 +0000 UTC m=+192.109463906" Apr 16 18:12:52.920514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.920481 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"] Apr 16 18:12:52.923638 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:52.923582 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-695bcd9585-lcxln"] Apr 16 18:12:53.316644 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:53.312348 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" path="/var/lib/kubelet/pods/f0446bd1-52b0-4692-ad20-21395d1e30ad/volumes" Apr 16 18:12:54.886527 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:54.886446 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"3a32346b62c5423e03cdf0f950a378c939d15410ee5ae1b6d4b85001d181da4e"} Apr 16 18:12:54.886527 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:54.886485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"6c10a0741fc83598b0e21408d2d6c6047ed8b256cba3b2f598e76640116a5cd8"} Apr 16 18:12:57.898513 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:57.898476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"da7ff815f77e4ed503f0b4e2c7e25be4550511560876d0b68285450337842df6"} Apr 16 18:12:57.898513 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:57.898517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"5e347d879f3a0155e7cfa716ef727d4bf00673f32e5dc802f389f81c0a04333d"} Apr 16 18:12:57.899164 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:57.898527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"27a7b3f7faac7235efb4a05e4145e0093ab16751652f1f33d2dfe8367365697c"} Apr 16 18:12:57.899164 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:57.898535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"881d1e93-5753-43ad-9c2a-d449bf97eb14","Type":"ContainerStarted","Data":"0c5c5851b3dc3e8e10277fcd8d4126de6e5460bcb1c202d00a584d7ec9e3c2fa"} Apr 16 18:12:57.928455 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:57.928387 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.916186521 podStartE2EDuration="8.92837148s" podCreationTimestamp="2026-04-16 18:12:49 +0000 UTC" firstStartedPulling="2026-04-16 18:12:50.859117827 +0000 UTC m=+190.063469634" lastFinishedPulling="2026-04-16 18:12:56.87130279 +0000 UTC m=+196.075654593" observedRunningTime="2026-04-16 18:12:57.927098729 +0000 UTC m=+197.131450555" watchObservedRunningTime="2026-04-16 18:12:57.92837148 +0000 UTC m=+197.132723309" Apr 16 18:12:59.467623 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:12:59.467584 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:07.585696 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:07.585660 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:13:07.585696 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:07.585704 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:13:27.590573 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:27.590528 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:13:27.594421 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:27.594400 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-66c678675d-p6mc8" Apr 16 18:13:49.467123 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:49.467093 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:49.485082 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:49.485063 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:50.058326 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:50.058305 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:53.156931 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:53.156881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:13:53.159551 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:53.159523 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4d545a6-5b19-4165-9bd6-f5c19acf145a-metrics-certs\") pod \"network-metrics-daemon-bbpzd\" (UID: \"d4d545a6-5b19-4165-9bd6-f5c19acf145a\") " pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:13:53.209975 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:53.209952 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\"" Apr 16 18:13:53.217746 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:53.217724 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbpzd" Apr 16 18:13:53.331764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:53.331734 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bbpzd"] Apr 16 18:13:53.335734 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:13:53.335707 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d545a6_5b19_4165_9bd6_f5c19acf145a.slice/crio-a61391877ec73356ad3682d94dd0a5a10bab279b5ce443ad362b8e73b3ad384f WatchSource:0}: Error finding container a61391877ec73356ad3682d94dd0a5a10bab279b5ce443ad362b8e73b3ad384f: Status 404 returned error can't find the container with id a61391877ec73356ad3682d94dd0a5a10bab279b5ce443ad362b8e73b3ad384f Apr 16 18:13:54.058035 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:54.057995 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbpzd" event={"ID":"d4d545a6-5b19-4165-9bd6-f5c19acf145a","Type":"ContainerStarted","Data":"a61391877ec73356ad3682d94dd0a5a10bab279b5ce443ad362b8e73b3ad384f"} Apr 16 18:13:55.062024 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:55.061995 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbpzd" event={"ID":"d4d545a6-5b19-4165-9bd6-f5c19acf145a","Type":"ContainerStarted","Data":"f59372fbd197ffa0092545d7b00bd22d5a4dbf784149cfc1975683c3d98c4614"} Apr 16 18:13:55.062024 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:55.062027 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbpzd" event={"ID":"d4d545a6-5b19-4165-9bd6-f5c19acf145a","Type":"ContainerStarted","Data":"7917407c27960648e778d41d8ab5b123ebe66bcd3569905c3ca29473926c3859"} Apr 16 18:13:55.084192 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:13:55.084151 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-bbpzd" podStartSLOduration=253.137020011 podStartE2EDuration="4m14.084136809s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:13:53.337419373 +0000 UTC m=+252.541771183" lastFinishedPulling="2026-04-16 18:13:54.284536178 +0000 UTC m=+253.488887981" observedRunningTime="2026-04-16 18:13:55.082154735 +0000 UTC m=+254.286506560" watchObservedRunningTime="2026-04-16 18:13:55.084136809 +0000 UTC m=+254.288488634" Apr 16 18:14:07.294744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.294660 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7bd779bd54-vndd7"] Apr 16 18:14:07.295125 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.295004 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerName="registry" Apr 16 18:14:07.295125 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.295020 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerName="registry" Apr 16 18:14:07.295125 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.295070 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0446bd1-52b0-4692-ad20-21395d1e30ad" containerName="registry" Apr 16 18:14:07.298939 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.298922 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.303031 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.302997 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 18:14:07.303031 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.303011 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 18:14:07.303269 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.303043 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-pzcc7\"" Apr 16 18:14:07.303516 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.303499 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 18:14:07.303611 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.303502 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 18:14:07.303704 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.303689 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 18:14:07.315182 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.315152 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 18:14:07.315426 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.315406 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bd779bd54-vndd7"] Apr 16 18:14:07.355631 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355602 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-federate-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355659 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-serving-certs-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355684 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqnvq\" (UniqueName: \"kubernetes.io/projected/3be674fa-942d-48cb-bea1-2b59304bf983-kube-api-access-cqnvq\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355864 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-metrics-client-ca\") pod 
\"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355864 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355864 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355836 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.355955 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.355878 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457112 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-federate-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " 
pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457146 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-serving-certs-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqnvq\" (UniqueName: \"kubernetes.io/projected/3be674fa-942d-48cb-bea1-2b59304bf983-kube-api-access-cqnvq\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-metrics-client-ca\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457511 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457511 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.457941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.457913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-serving-certs-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.458164 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.458141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-metrics-client-ca\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: 
\"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.458255 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.458197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.459872 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.459839 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-federate-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.459978 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.459880 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.460043 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.460003 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.460101 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.460078 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3be674fa-942d-48cb-bea1-2b59304bf983-telemeter-client-tls\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.466066 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.466049 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqnvq\" (UniqueName: \"kubernetes.io/projected/3be674fa-942d-48cb-bea1-2b59304bf983-kube-api-access-cqnvq\") pod \"telemeter-client-7bd779bd54-vndd7\" (UID: \"3be674fa-942d-48cb-bea1-2b59304bf983\") " pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.608708 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.608681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" Apr 16 18:14:07.728954 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:07.728924 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bd779bd54-vndd7"] Apr 16 18:14:07.732184 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:14:07.732159 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3be674fa_942d_48cb_bea1_2b59304bf983.slice/crio-de22dff6634527cbfad7f155a70c3a049d3df9683a3a18b141ff667a19c45fad WatchSource:0}: Error finding container de22dff6634527cbfad7f155a70c3a049d3df9683a3a18b141ff667a19c45fad: Status 404 returned error can't find the container with id de22dff6634527cbfad7f155a70c3a049d3df9683a3a18b141ff667a19c45fad Apr 16 18:14:08.097601 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:08.097570 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" 
event={"ID":"3be674fa-942d-48cb-bea1-2b59304bf983","Type":"ContainerStarted","Data":"de22dff6634527cbfad7f155a70c3a049d3df9683a3a18b141ff667a19c45fad"} Apr 16 18:14:10.104979 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:10.104948 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" event={"ID":"3be674fa-942d-48cb-bea1-2b59304bf983","Type":"ContainerStarted","Data":"07ae7f9f458ac7e76347a2102ec067e9fcb81b1fb892e4124bde526056b9bb51"} Apr 16 18:14:10.104979 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:10.104982 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" event={"ID":"3be674fa-942d-48cb-bea1-2b59304bf983","Type":"ContainerStarted","Data":"fd6c57bce4de4dbdfd9fd0afa1ab9bc050e81656fc1d2345748ed6f7a2ac4676"} Apr 16 18:14:10.105362 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:10.104991 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" event={"ID":"3be674fa-942d-48cb-bea1-2b59304bf983","Type":"ContainerStarted","Data":"fc8951df7107de966a12ba326c8aafd54480e371bc2f0f07c497a0fb3ec12c3d"} Apr 16 18:14:10.137814 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:10.137767 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7bd779bd54-vndd7" podStartSLOduration=1.645556289 podStartE2EDuration="3.137753693s" podCreationTimestamp="2026-04-16 18:14:07 +0000 UTC" firstStartedPulling="2026-04-16 18:14:07.73394733 +0000 UTC m=+266.938299134" lastFinishedPulling="2026-04-16 18:14:09.226144721 +0000 UTC m=+268.430496538" observedRunningTime="2026-04-16 18:14:10.136898822 +0000 UTC m=+269.341250645" watchObservedRunningTime="2026-04-16 18:14:10.137753693 +0000 UTC m=+269.342105519" Apr 16 18:14:41.204929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:41.204903 2567 kubelet.go:1628] "Image garbage collection succeeded" 
Apr 16 18:14:59.969433 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:59.969402 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z58pc"] Apr 16 18:14:59.974792 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:59.974773 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:14:59.977048 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:59.977024 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:14:59.984090 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:14:59.984066 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z58pc"] Apr 16 18:15:00.133653 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.133621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-kubelet-config\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.133653 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.133668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-dbus\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.133855 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.133690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2bb0d16-0207-47e2-ad4a-862fbaea345c-original-pull-secret\") pod \"global-pull-secret-syncer-z58pc\" (UID: 
\"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.234527 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.234458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-dbus\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.234527 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.234498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2bb0d16-0207-47e2-ad4a-862fbaea345c-original-pull-secret\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.234685 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.234547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-kubelet-config\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.234685 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.234635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-kubelet-config\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.234685 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.234665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2bb0d16-0207-47e2-ad4a-862fbaea345c-dbus\") pod 
\"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.236653 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.236636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2bb0d16-0207-47e2-ad4a-862fbaea345c-original-pull-secret\") pod \"global-pull-secret-syncer-z58pc\" (UID: \"c2bb0d16-0207-47e2-ad4a-862fbaea345c\") " pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.284349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.284331 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z58pc" Apr 16 18:15:00.401278 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.401254 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z58pc"] Apr 16 18:15:00.403611 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:15:00.403583 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2bb0d16_0207_47e2_ad4a_862fbaea345c.slice/crio-9ac2f7dda1f353a3901c727b683526dde47fd8b7a3a6a6f4a39872d0e5c1ea6f WatchSource:0}: Error finding container 9ac2f7dda1f353a3901c727b683526dde47fd8b7a3a6a6f4a39872d0e5c1ea6f: Status 404 returned error can't find the container with id 9ac2f7dda1f353a3901c727b683526dde47fd8b7a3a6a6f4a39872d0e5c1ea6f Apr 16 18:15:00.405273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:00.405257 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:01.255706 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:01.255667 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z58pc" 
event={"ID":"c2bb0d16-0207-47e2-ad4a-862fbaea345c","Type":"ContainerStarted","Data":"9ac2f7dda1f353a3901c727b683526dde47fd8b7a3a6a6f4a39872d0e5c1ea6f"} Apr 16 18:15:05.271857 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:05.271818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z58pc" event={"ID":"c2bb0d16-0207-47e2-ad4a-862fbaea345c","Type":"ContainerStarted","Data":"db97ce650cbd69ba826b651e9f426d2c0ebf4537da17986ec8dc27c62e0f2fda"} Apr 16 18:15:05.290326 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:15:05.290279 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z58pc" podStartSLOduration=2.150728463 podStartE2EDuration="6.290265617s" podCreationTimestamp="2026-04-16 18:14:59 +0000 UTC" firstStartedPulling="2026-04-16 18:15:00.405379366 +0000 UTC m=+319.609731173" lastFinishedPulling="2026-04-16 18:15:04.544916523 +0000 UTC m=+323.749268327" observedRunningTime="2026-04-16 18:15:05.288659587 +0000 UTC m=+324.493011412" watchObservedRunningTime="2026-04-16 18:15:05.290265617 +0000 UTC m=+324.494617442" Apr 16 18:19:45.673470 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.673439 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct"] Apr 16 18:19:45.676625 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.676607 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.681021 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.680993 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 18:19:45.682269 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.682244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-w9fgb\"" Apr 16 18:19:45.682386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.682250 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 18:19:45.682386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.682303 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 18:19:45.682386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.682252 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:45.682386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.682352 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 18:19:45.687666 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.687646 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct"] Apr 16 18:19:45.772694 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.772668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " 
pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.772841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.772704 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.772841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.772734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ld44\" (UniqueName: \"kubernetes.io/projected/78238f47-b733-40e0-843a-76b695784854-kube-api-access-2ld44\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.772841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.772794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/78238f47-b733-40e0-843a-76b695784854-manager-config\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.874096 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.874062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/78238f47-b733-40e0-843a-76b695784854-manager-config\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.874262 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.874121 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.874262 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.874146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.874262 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.874165 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ld44\" (UniqueName: \"kubernetes.io/projected/78238f47-b733-40e0-843a-76b695784854-kube-api-access-2ld44\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.874841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.874810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/78238f47-b733-40e0-843a-76b695784854-manager-config\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.876648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.876624 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-metrics-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: 
\"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.876754 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.876673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78238f47-b733-40e0-843a-76b695784854-cert\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.888400 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.888378 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ld44\" (UniqueName: \"kubernetes.io/projected/78238f47-b733-40e0-843a-76b695784854-kube-api-access-2ld44\") pod \"lws-controller-manager-65bdb464b4-t68ct\" (UID: \"78238f47-b733-40e0-843a-76b695784854\") " pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:45.990762 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:45.990692 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:46.109610 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:46.109586 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct"] Apr 16 18:19:46.111811 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:19:46.111781 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78238f47_b733_40e0_843a_76b695784854.slice/crio-92dc84d6667ad5344da3e5933948082ea3e963bcc9db48f32b6e02dc4e0ce476 WatchSource:0}: Error finding container 92dc84d6667ad5344da3e5933948082ea3e963bcc9db48f32b6e02dc4e0ce476: Status 404 returned error can't find the container with id 92dc84d6667ad5344da3e5933948082ea3e963bcc9db48f32b6e02dc4e0ce476 Apr 16 18:19:47.057855 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:47.057812 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" event={"ID":"78238f47-b733-40e0-843a-76b695784854","Type":"ContainerStarted","Data":"92dc84d6667ad5344da3e5933948082ea3e963bcc9db48f32b6e02dc4e0ce476"} Apr 16 18:19:49.064591 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:49.064493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" event={"ID":"78238f47-b733-40e0-843a-76b695784854","Type":"ContainerStarted","Data":"a0e2e55f7c461b410b501bbd6a1410a7f3fff65e6fdc76b7df39bd2cadc3456b"} Apr 16 18:19:49.065063 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:49.064601 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:19:49.088249 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:19:49.088206 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" podStartSLOduration=1.536089755 podStartE2EDuration="4.088194303s" podCreationTimestamp="2026-04-16 18:19:45 +0000 UTC" firstStartedPulling="2026-04-16 18:19:46.113469345 +0000 UTC m=+605.317821149" lastFinishedPulling="2026-04-16 18:19:48.665573889 +0000 UTC m=+607.869925697" observedRunningTime="2026-04-16 18:19:49.087172074 +0000 UTC m=+608.291523904" watchObservedRunningTime="2026-04-16 18:19:49.088194303 +0000 UTC m=+608.292546129" Apr 16 18:20:00.069834 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:00.069738 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-65bdb464b4-t68ct" Apr 16 18:20:33.932070 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.932032 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"] Apr 16 18:20:33.934699 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.934683 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:33.936979 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.936958 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:20:33.937428 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.937407 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-m96lj\"" Apr 16 18:20:33.937510 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.937413 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:20:33.947552 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:33.947528 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"] Apr 16 18:20:34.126929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.126898 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rw74\" (UniqueName: \"kubernetes.io/projected/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-kube-api-access-7rw74\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.127154 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.126971 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.227581 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.227482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rw74\" (UniqueName: \"kubernetes.io/projected/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-kube-api-access-7rw74\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.227581 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.227539 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.227882 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.227864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.237155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.237124 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rw74\" (UniqueName: \"kubernetes.io/projected/67dd972d-b8b2-4a9e-b970-30a3c3e75fac-kube-api-access-7rw74\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bbjnh\" (UID: \"67dd972d-b8b2-4a9e-b970-30a3c3e75fac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" Apr 16 18:20:34.244339 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.244319 2567 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"
Apr 16 18:20:34.367875 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.367844 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"]
Apr 16 18:20:34.371120 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:20:34.371081 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67dd972d_b8b2_4a9e_b970_30a3c3e75fac.slice/crio-5bd168ec239cd20295dbcf38a7b928be160c53def8cbeb43ab5460969c831a7c WatchSource:0}: Error finding container 5bd168ec239cd20295dbcf38a7b928be160c53def8cbeb43ab5460969c831a7c: Status 404 returned error can't find the container with id 5bd168ec239cd20295dbcf38a7b928be160c53def8cbeb43ab5460969c831a7c
Apr 16 18:20:34.375750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:34.375733 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:20:35.203045 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:35.203005 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" event={"ID":"67dd972d-b8b2-4a9e-b970-30a3c3e75fac","Type":"ContainerStarted","Data":"5bd168ec239cd20295dbcf38a7b928be160c53def8cbeb43ab5460969c831a7c"}
Apr 16 18:20:39.218411 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:39.218374 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" event={"ID":"67dd972d-b8b2-4a9e-b970-30a3c3e75fac","Type":"ContainerStarted","Data":"50fb782a27f4ededd16f168180875f09302b9a8b685e173aa2b14e8cf528dbd6"}
Apr 16 18:20:39.218832 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:39.218531 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"
Apr 16 18:20:39.239454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:39.239405 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh" podStartSLOduration=2.419928169 podStartE2EDuration="6.239393674s" podCreationTimestamp="2026-04-16 18:20:33 +0000 UTC" firstStartedPulling="2026-04-16 18:20:34.375866802 +0000 UTC m=+653.580218609" lastFinishedPulling="2026-04-16 18:20:38.195332309 +0000 UTC m=+657.399684114" observedRunningTime="2026-04-16 18:20:39.238920091 +0000 UTC m=+658.443271946" watchObservedRunningTime="2026-04-16 18:20:39.239393674 +0000 UTC m=+658.443745499"
Apr 16 18:20:50.223190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:20:50.223163 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bbjnh"
Apr 16 18:21:25.616911 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.616878 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pvznl"]
Apr 16 18:21:25.623488 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.623467 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.625806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.625785 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 18:21:25.625922 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.625785 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4727g\""
Apr 16 18:21:25.631070 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.631050 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pvznl"]
Apr 16 18:21:25.655655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.655633 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pvznl"]
Apr 16 18:21:25.722880 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.722857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-config-file\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.722954 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.722885 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l44r\" (UniqueName: \"kubernetes.io/projected/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-kube-api-access-5l44r\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.823750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.823727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-config-file\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.823836 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.823754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5l44r\" (UniqueName: \"kubernetes.io/projected/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-kube-api-access-5l44r\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.824255 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.824238 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-config-file\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.832879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.832854 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l44r\" (UniqueName: \"kubernetes.io/projected/1ac5f7a6-ea18-44b2-bcc7-9d234001848b-kube-api-access-5l44r\") pod \"limitador-limitador-67566c68b4-pvznl\" (UID: \"1ac5f7a6-ea18-44b2-bcc7-9d234001848b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:25.935145 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:25.935092 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:26.050935 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.050855 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pvznl"]
Apr 16 18:21:26.053196 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:21:26.053169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac5f7a6_ea18_44b2_bcc7_9d234001848b.slice/crio-cff71011fcff9bb9fef52308d247fe281b32feb05284118f3949e7798050f1b8 WatchSource:0}: Error finding container cff71011fcff9bb9fef52308d247fe281b32feb05284118f3949e7798050f1b8: Status 404 returned error can't find the container with id cff71011fcff9bb9fef52308d247fe281b32feb05284118f3949e7798050f1b8
Apr 16 18:21:26.306688 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.306623 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:26.311300 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.311282 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:26.313411 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.313391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wmjdn\""
Apr 16 18:21:26.317364 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.317343 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:26.366901 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.366875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl" event={"ID":"1ac5f7a6-ea18-44b2-bcc7-9d234001848b","Type":"ContainerStarted","Data":"cff71011fcff9bb9fef52308d247fe281b32feb05284118f3949e7798050f1b8"}
Apr 16 18:21:26.430358 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.430333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6fw\" (UniqueName: \"kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw\") pod \"authorino-79cbc94b89-74khj\" (UID: \"4e25028d-f4ff-4bae-9d36-80c01f270f4c\") " pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:26.530975 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.530950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6fw\" (UniqueName: \"kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw\") pod \"authorino-79cbc94b89-74khj\" (UID: \"4e25028d-f4ff-4bae-9d36-80c01f270f4c\") " pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:26.539273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.539253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6fw\" (UniqueName: \"kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw\") pod \"authorino-79cbc94b89-74khj\" (UID: \"4e25028d-f4ff-4bae-9d36-80c01f270f4c\") " pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:26.621806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.621788 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:26.739509 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:26.739464 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:26.742513 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:21:26.742487 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e25028d_f4ff_4bae_9d36_80c01f270f4c.slice/crio-4ce6f37aef1ecc56764736944749b02c5bc090768164e48a4af119007f2df4db WatchSource:0}: Error finding container 4ce6f37aef1ecc56764736944749b02c5bc090768164e48a4af119007f2df4db: Status 404 returned error can't find the container with id 4ce6f37aef1ecc56764736944749b02c5bc090768164e48a4af119007f2df4db
Apr 16 18:21:27.371907 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:27.371873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-74khj" event={"ID":"4e25028d-f4ff-4bae-9d36-80c01f270f4c","Type":"ContainerStarted","Data":"4ce6f37aef1ecc56764736944749b02c5bc090768164e48a4af119007f2df4db"}
Apr 16 18:21:30.381953 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:30.381886 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl" event={"ID":"1ac5f7a6-ea18-44b2-bcc7-9d234001848b","Type":"ContainerStarted","Data":"d83be414eb5d60a129d8e07cb6081dd2e5c8a16d6f5e75ccb080f8705fc75459"}
Apr 16 18:21:30.382297 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:30.382020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:30.402602 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:30.402539 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl" podStartSLOduration=1.648164307 podStartE2EDuration="5.402526662s" podCreationTimestamp="2026-04-16 18:21:25 +0000 UTC" firstStartedPulling="2026-04-16 18:21:26.054840889 +0000 UTC m=+705.259192694" lastFinishedPulling="2026-04-16 18:21:29.809203246 +0000 UTC m=+709.013555049" observedRunningTime="2026-04-16 18:21:30.401118837 +0000 UTC m=+709.605470687" watchObservedRunningTime="2026-04-16 18:21:30.402526662 +0000 UTC m=+709.606878519"
Apr 16 18:21:32.390854 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:32.390821 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-74khj" event={"ID":"4e25028d-f4ff-4bae-9d36-80c01f270f4c","Type":"ContainerStarted","Data":"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"}
Apr 16 18:21:32.405782 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:32.405732 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-74khj" podStartSLOduration=0.987562268 podStartE2EDuration="6.405715686s" podCreationTimestamp="2026-04-16 18:21:26 +0000 UTC" firstStartedPulling="2026-04-16 18:21:26.744104759 +0000 UTC m=+705.948456580" lastFinishedPulling="2026-04-16 18:21:32.162258179 +0000 UTC m=+711.366609998" observedRunningTime="2026-04-16 18:21:32.405100689 +0000 UTC m=+711.609452528" watchObservedRunningTime="2026-04-16 18:21:32.405715686 +0000 UTC m=+711.610067514"
Apr 16 18:21:41.385888 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:41.385863 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-pvznl"
Apr 16 18:21:52.155535 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.155481 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:52.156013 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.155785 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-74khj" podUID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" containerName="authorino" containerID="cri-o://c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106" gracePeriod=30
Apr 16 18:21:52.401322 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.401300 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:52.429775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.429705 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6fw\" (UniqueName: \"kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw\") pod \"4e25028d-f4ff-4bae-9d36-80c01f270f4c\" (UID: \"4e25028d-f4ff-4bae-9d36-80c01f270f4c\") "
Apr 16 18:21:52.432180 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.432150 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw" (OuterVolumeSpecName: "kube-api-access-sw6fw") pod "4e25028d-f4ff-4bae-9d36-80c01f270f4c" (UID: "4e25028d-f4ff-4bae-9d36-80c01f270f4c"). InnerVolumeSpecName "kube-api-access-sw6fw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:21:52.457764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.457742 2567 generic.go:358] "Generic (PLEG): container finished" podID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" containerID="c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106" exitCode=0
Apr 16 18:21:52.457853 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.457789 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-74khj"
Apr 16 18:21:52.457853 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.457803 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-74khj" event={"ID":"4e25028d-f4ff-4bae-9d36-80c01f270f4c","Type":"ContainerDied","Data":"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"}
Apr 16 18:21:52.457853 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.457826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-74khj" event={"ID":"4e25028d-f4ff-4bae-9d36-80c01f270f4c","Type":"ContainerDied","Data":"4ce6f37aef1ecc56764736944749b02c5bc090768164e48a4af119007f2df4db"}
Apr 16 18:21:52.457853 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.457848 2567 scope.go:117] "RemoveContainer" containerID="c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"
Apr 16 18:21:52.465253 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.465222 2567 scope.go:117] "RemoveContainer" containerID="c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"
Apr 16 18:21:52.465530 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:21:52.465502 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106\": container with ID starting with c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106 not found: ID does not exist" containerID="c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"
Apr 16 18:21:52.465596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.465525 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106"} err="failed to get container status \"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106\": rpc error: code = NotFound desc = could not find container \"c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106\": container with ID starting with c2bfa4e88ee0ed926b85614f3c053ee6f3473e1d0bfeeed069088a9e8c4d5106 not found: ID does not exist"
Apr 16 18:21:52.483006 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.482979 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:52.487198 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.487179 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-74khj"]
Apr 16 18:21:52.531156 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:52.531133 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sw6fw\" (UniqueName: \"kubernetes.io/projected/4e25028d-f4ff-4bae-9d36-80c01f270f4c-kube-api-access-sw6fw\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:21:53.310514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:21:53.310486 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" path="/var/lib/kubelet/pods/4e25028d-f4ff-4bae-9d36-80c01f270f4c/volumes"
Apr 16 18:23:37.347223 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.347185 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-wq7x2"]
Apr 16 18:23:37.347714 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.347530 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" containerName="authorino"
Apr 16 18:23:37.347714 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.347544 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" containerName="authorino"
Apr 16 18:23:37.347714 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.347628 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e25028d-f4ff-4bae-9d36-80c01f270f4c" containerName="authorino"
Apr 16 18:23:37.350435 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.350417 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wq7x2"
Apr 16 18:23:37.353092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.353066 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:23:37.353092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.353072 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:23:37.354166 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.353902 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:23:37.354278 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.354029 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wqstx\""
Apr 16 18:23:37.361087 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.358918 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wq7x2"]
Apr 16 18:23:37.413670 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.413649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv99\" (UniqueName: \"kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99\") pod \"s3-init-wq7x2\" (UID: \"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a\") " pod="kserve/s3-init-wq7x2"
Apr 16 18:23:37.514714 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.514692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv99\" (UniqueName: \"kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99\") pod \"s3-init-wq7x2\" (UID: \"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a\") " pod="kserve/s3-init-wq7x2"
Apr 16 18:23:37.524682 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.524663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv99\" (UniqueName: \"kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99\") pod \"s3-init-wq7x2\" (UID: \"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a\") " pod="kserve/s3-init-wq7x2"
Apr 16 18:23:37.666824 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.666763 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wq7x2"
Apr 16 18:23:37.786602 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.786547 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wq7x2"]
Apr 16 18:23:37.789544 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:23:37.789512 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod006ba9e6_35c5_4e2e_b6b9_fa60ace64d7a.slice/crio-bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7 WatchSource:0}: Error finding container bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7: Status 404 returned error can't find the container with id bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7
Apr 16 18:23:37.809609 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:37.809580 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wq7x2" event={"ID":"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a","Type":"ContainerStarted","Data":"bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7"}
Apr 16 18:23:42.116488 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:42.116464 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:23:42.829580 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:42.829527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wq7x2" event={"ID":"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a","Type":"ContainerStarted","Data":"bd339ecdc742cb31c95cad6915bb63feefdb09709deb84620dbf401ebb1c9d58"}
Apr 16 18:23:42.846147 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:42.846101 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-wq7x2" podStartSLOduration=1.5234193870000001 podStartE2EDuration="5.846088973s" podCreationTimestamp="2026-04-16 18:23:37 +0000 UTC" firstStartedPulling="2026-04-16 18:23:37.791479382 +0000 UTC m=+836.995831192" lastFinishedPulling="2026-04-16 18:23:42.114148974 +0000 UTC m=+841.318500778" observedRunningTime="2026-04-16 18:23:42.845025595 +0000 UTC m=+842.049377420" watchObservedRunningTime="2026-04-16 18:23:42.846088973 +0000 UTC m=+842.050440800"
Apr 16 18:23:45.841136 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:45.841097 2567 generic.go:358] "Generic (PLEG): container finished" podID="006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a" containerID="bd339ecdc742cb31c95cad6915bb63feefdb09709deb84620dbf401ebb1c9d58" exitCode=0
Apr 16 18:23:45.841543 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:45.841177 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wq7x2" event={"ID":"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a","Type":"ContainerDied","Data":"bd339ecdc742cb31c95cad6915bb63feefdb09709deb84620dbf401ebb1c9d58"}
Apr 16 18:23:46.968099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:46.968078 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wq7x2"
Apr 16 18:23:47.092829 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.092800 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv99\" (UniqueName: \"kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99\") pod \"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a\" (UID: \"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a\") "
Apr 16 18:23:47.094885 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.094855 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99" (OuterVolumeSpecName: "kube-api-access-dwv99") pod "006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a" (UID: "006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a"). InnerVolumeSpecName "kube-api-access-dwv99". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:23:47.193808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.193777 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwv99\" (UniqueName: \"kubernetes.io/projected/006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a-kube-api-access-dwv99\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:23:47.849403 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.849365 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wq7x2" event={"ID":"006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a","Type":"ContainerDied","Data":"bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7"}
Apr 16 18:23:47.849403 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.849399 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec756a2c3934e560c6fb7a40aa62227608046598e8c2ddf4fefb29d30ddb5c7"
Apr 16 18:23:47.849403 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:47.849374 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wq7x2"
Apr 16 18:23:58.141461 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.141431 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"]
Apr 16 18:23:58.141973 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.141951 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a" containerName="s3-init"
Apr 16 18:23:58.142045 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.141975 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a" containerName="s3-init"
Apr 16 18:23:58.142102 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.142057 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a" containerName="s3-init"
Apr 16 18:23:58.145155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.145133 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.147694 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.147671 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:23:58.147817 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.147694 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:23:58.147817 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.147671 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 18:23:58.148043 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.148023 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-lqxh7\""
Apr 16 18:23:58.157620 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.157598 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"]
Apr 16 18:23:58.275667 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275764 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275719 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsgs\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-kube-api-access-dzsgs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275971 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275903 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.275971 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.275937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376654 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376673 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376766 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376919 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376978 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.376978 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.376953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsgs\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-kube-api-access-dzsgs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377115 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377115 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377175 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377656 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"
Apr 16 18:23:58.377806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.377787 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID:
\"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.379488 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.379466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.379703 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.379687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.386598 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.386579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.387005 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.386985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsgs\" (UniqueName: \"kubernetes.io/projected/30a2bae1-62f9-4614-ab04-2a74ee4cbd67-kube-api-access-dzsgs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-9kd8d\" (UID: \"30a2bae1-62f9-4614-ab04-2a74ee4cbd67\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.457492 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:23:58.457440 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:23:58.577649 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.577593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d"] Apr 16 18:23:58.579755 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:23:58.579730 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a2bae1_62f9_4614_ab04_2a74ee4cbd67.slice/crio-d26340b39f34f40f55e6560d9f1c6fbeaf5d4700ea0894578a8ec0e75ed66d00 WatchSource:0}: Error finding container d26340b39f34f40f55e6560d9f1c6fbeaf5d4700ea0894578a8ec0e75ed66d00: Status 404 returned error can't find the container with id d26340b39f34f40f55e6560d9f1c6fbeaf5d4700ea0894578a8ec0e75ed66d00 Apr 16 18:23:58.883986 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:23:58.883962 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" event={"ID":"30a2bae1-62f9-4614-ab04-2a74ee4cbd67","Type":"ContainerStarted","Data":"d26340b39f34f40f55e6560d9f1c6fbeaf5d4700ea0894578a8ec0e75ed66d00"} Apr 16 18:24:01.098568 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:01.098515 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236212Ki","pods":"250"} Apr 16 18:24:01.098953 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:01.098676 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236212Ki","pods":"250"} Apr 16 18:24:01.098953 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:01.098724 2567 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236212Ki","pods":"250"} Apr 16 18:24:01.895938 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:01.895898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" event={"ID":"30a2bae1-62f9-4614-ab04-2a74ee4cbd67","Type":"ContainerStarted","Data":"95680e419225783c0b04873b2cecacd3ffe45c8e04585620b74666e0727ac6b7"} Apr 16 18:24:01.917979 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:01.917930 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" podStartSLOduration=1.401335481 podStartE2EDuration="3.917917156s" podCreationTimestamp="2026-04-16 18:23:58 +0000 UTC" firstStartedPulling="2026-04-16 18:23:58.581748786 +0000 UTC m=+857.786100594" lastFinishedPulling="2026-04-16 18:24:01.098330464 +0000 UTC m=+860.302682269" observedRunningTime="2026-04-16 18:24:01.915797044 +0000 UTC m=+861.120148869" watchObservedRunningTime="2026-04-16 18:24:01.917917156 +0000 UTC m=+861.122268976" Apr 16 18:24:02.457716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:02.457687 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:24:02.462528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:02.462506 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:24:02.899142 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:02.899114 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:24:02.899984 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:02.899966 2567 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-9kd8d" Apr 16 18:24:08.694121 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.694089 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"] Apr 16 18:24:08.697324 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.697304 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.699948 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.699924 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:24:08.700597 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.700574 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 18:24:08.700697 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.700612 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-77c8b\"" Apr 16 18:24:08.709778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.709752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"] Apr 16 18:24:08.855148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855112 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.855274 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5qr\" (UniqueName: \"kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.855274 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.855378 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855298 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.855378 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: 
\"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.855475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.855402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.955894 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.955835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5qr\" (UniqueName: \"kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.955894 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.955872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956064 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.955905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: 
\"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956064 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.955938 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956064 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.955986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956064 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.956019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956275 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.956256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956312 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.956290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.956384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.956509 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.956387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.958370 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.958350 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:08.965042 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:08.965014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5qr\" (UniqueName: \"kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:09.007019 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:09.006989 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:09.123371 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:09.123345 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"] Apr 16 18:24:09.126000 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:24:09.125975 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda981f916_553d_4f89_9ef5_1c6ebc20772b.slice/crio-6b5dbcd7845cd4dc51d22d34a0a14734dbc30c5c2e25805537ddd87ac92fdf90 WatchSource:0}: Error finding container 6b5dbcd7845cd4dc51d22d34a0a14734dbc30c5c2e25805537ddd87ac92fdf90: Status 404 returned error can't find the container with id 6b5dbcd7845cd4dc51d22d34a0a14734dbc30c5c2e25805537ddd87ac92fdf90 Apr 16 18:24:09.922353 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:09.922306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerStarted","Data":"6b5dbcd7845cd4dc51d22d34a0a14734dbc30c5c2e25805537ddd87ac92fdf90"} Apr 16 
18:24:12.933761 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:12.933724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerStarted","Data":"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"} Apr 16 18:24:13.938248 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:13.938220 2567 generic.go:358] "Generic (PLEG): container finished" podID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerID="8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5" exitCode=0 Apr 16 18:24:13.938626 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:13.938273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerDied","Data":"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"} Apr 16 18:24:15.946524 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:15.946488 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerStarted","Data":"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"} Apr 16 18:24:46.057906 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:46.057861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerStarted","Data":"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"} Apr 16 18:24:46.058294 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:46.058102 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:46.060524 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:46.060506 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:46.081833 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:46.081782 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" podStartSLOduration=1.358581819 podStartE2EDuration="38.08177009s" podCreationTimestamp="2026-04-16 18:24:08 +0000 UTC" firstStartedPulling="2026-04-16 18:24:09.127864838 +0000 UTC m=+868.332216642" lastFinishedPulling="2026-04-16 18:24:45.851053094 +0000 UTC m=+905.055404913" observedRunningTime="2026-04-16 18:24:46.07908194 +0000 UTC m=+905.283433766" watchObservedRunningTime="2026-04-16 18:24:46.08177009 +0000 UTC m=+905.286121917" Apr 16 18:24:49.008124 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:49.008089 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:49.008124 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:49.008131 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:59.008950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:59.008919 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:24:59.010053 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:24:59.010034 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:25:20.396268 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:20.396235 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"] Apr 16 18:25:20.396710 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:20.396604 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="main" containerID="cri-o://b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6" gracePeriod=30 Apr 16 18:25:20.396710 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:20.396604 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="tokenizer" containerID="cri-o://4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab" gracePeriod=30 Apr 16 18:25:21.176128 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.176097 2567 generic.go:358] "Generic (PLEG): container finished" podID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerID="b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6" exitCode=0 Apr 16 18:25:21.176325 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.176165 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerDied","Data":"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"} Apr 16 18:25:21.549190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.549167 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" Apr 16 18:25:21.573295 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573268 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " Apr 16 18:25:21.573421 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573327 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " Apr 16 18:25:21.573421 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573362 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " Apr 16 18:25:21.573421 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573391 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " Apr 16 18:25:21.573601 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573435 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl5qr\" (UniqueName: \"kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") " Apr 16 
18:25:21.573601 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573487 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location\") pod \"a981f916-553d-4f89-9ef5-1c6ebc20772b\" (UID: \"a981f916-553d-4f89-9ef5-1c6ebc20772b\") "
Apr 16 18:25:21.573718 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573656 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:25:21.573808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573790 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:21.573887 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.573861 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:25:21.574038 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.574018 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:25:21.574271 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.574249 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:25:21.575773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.575752 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:25:21.575857 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.575778 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr" (OuterVolumeSpecName: "kube-api-access-bl5qr") pod "a981f916-553d-4f89-9ef5-1c6ebc20772b" (UID: "a981f916-553d-4f89-9ef5-1c6ebc20772b"). InnerVolumeSpecName "kube-api-access-bl5qr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:25:21.674176 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.674149 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a981f916-553d-4f89-9ef5-1c6ebc20772b-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:21.674176 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.674175 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:21.674337 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.674185 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:21.674337 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.674194 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bl5qr\" (UniqueName: \"kubernetes.io/projected/a981f916-553d-4f89-9ef5-1c6ebc20772b-kube-api-access-bl5qr\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:21.674337 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:21.674204 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a981f916-553d-4f89-9ef5-1c6ebc20772b-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:25:22.181464 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.181431 2567 generic.go:358] "Generic (PLEG): container finished" podID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerID="4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab" exitCode=0
Apr 16 18:25:22.181663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.181502
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerDied","Data":"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"}
Apr 16 18:25:22.181663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.181517 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"
Apr 16 18:25:22.181663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.181536 2567 scope.go:117] "RemoveContainer" containerID="4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"
Apr 16 18:25:22.181663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.181527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg" event={"ID":"a981f916-553d-4f89-9ef5-1c6ebc20772b","Type":"ContainerDied","Data":"6b5dbcd7845cd4dc51d22d34a0a14734dbc30c5c2e25805537ddd87ac92fdf90"}
Apr 16 18:25:22.193066 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.193043 2567 scope.go:117] "RemoveContainer" containerID="b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"
Apr 16 18:25:22.200415 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.200389 2567 scope.go:117] "RemoveContainer" containerID="8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"
Apr 16 18:25:22.206785 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.206762 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"]
Apr 16 18:25:22.207903 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.207888 2567 scope.go:117] "RemoveContainer" containerID="4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"
Apr 16 18:25:22.208168 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:25:22.208151 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab\": container with ID starting with 4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab not found: ID does not exist" containerID="4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"
Apr 16 18:25:22.208243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.208181 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab"} err="failed to get container status \"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab\": rpc error: code = NotFound desc = could not find container \"4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab\": container with ID starting with 4a1b3762d6734183653d564c9ee5245e26bc32e43966f207384faac821ab3aab not found: ID does not exist"
Apr 16 18:25:22.208243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.208205 2567 scope.go:117] "RemoveContainer" containerID="b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"
Apr 16 18:25:22.208452 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:25:22.208430 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6\": container with ID starting with b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6 not found: ID does not exist" containerID="b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"
Apr 16 18:25:22.208512 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.208462 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6"} err="failed to get container status \"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6\": rpc error: code = NotFound desc = could not find container \"b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6\": container with ID starting with b5fe78855055ff65b7814154865ef71a7a29a7b13f076928ff1760f0a96a76e6 not found: ID does not exist"
Apr 16 18:25:22.208512 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.208484 2567 scope.go:117] "RemoveContainer" containerID="8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"
Apr 16 18:25:22.208701 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:25:22.208683 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5\": container with ID starting with 8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5 not found: ID does not exist" containerID="8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"
Apr 16 18:25:22.208744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.208707 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5"} err="failed to get container status \"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5\": rpc error: code = NotFound desc = could not find container \"8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5\": container with ID starting with 8fc62e1eb87507f5e7a00f66331d4f27073b3e777d0227a6b4d579f7cf2c19c5 not found: ID does not exist"
Apr 16 18:25:22.213579 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:22.213544 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-757b752xdg"]
Apr 16 18:25:23.310763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:23.310735 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" path="/var/lib/kubelet/pods/a981f916-553d-4f89-9ef5-1c6ebc20772b/volumes"
Apr 16 18:25:32.698927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.698891 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"]
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699197 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="storage-initializer"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699207 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="storage-initializer"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699220 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="main"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699228 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="main"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699261 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="tokenizer"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699267 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="tokenizer"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699321 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="main"
Apr 16 18:25:32.699404 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.699328 2567 memory_manager.go:356] "RemoveStaleState
removing state" podUID="a981f916-553d-4f89-9ef5-1c6ebc20772b" containerName="tokenizer"
Apr 16 18:25:32.704018 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.703998 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.707037 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.707014 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:25:32.707151 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.707014 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 18:25:32.710323 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.710297 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"]
Apr 16 18:25:32.757950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.757908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6txs\" (UniqueName: \"kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.757950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.757952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.758122 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.757985 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.758122 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.758003 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.758122 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.758023 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.758122 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.758050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.858684 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.858804 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.858804 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.858804 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.858804 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858803 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.859022 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.858856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6txs\" (UniqueName: \"kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.859127 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.859094 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.859194 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.859130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.859194 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.859186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.861069 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.861049 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.861224 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.861209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.866964 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.866920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6txs\" (UniqueName: \"kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs\") pod \"scheduler-ha-replicas-test-kserve-769474c8cf-rm959\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:32.968273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.968207 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"]
Apr 16 18:25:32.971759 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.971743 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:32.975191 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.975172 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-fv66f\""
Apr 16 18:25:32.988544 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:32.988521 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"]
Apr 16 18:25:33.015080 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.015059 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:33.060502 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.060469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.060688 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.060531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.060763 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.060707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmx4\" (UniqueName: \"kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.060867 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.060815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.060983 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.060892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.061049 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.061011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.140154 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.140125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"]
Apr 16 18:25:33.142659 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:25:33.142633 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426804c4_e2c6_453d_85e3_f1aaef3128bd.slice/crio-6bdac39397122fad20ca5aacc54e288f085d4e9bc35b8915ae9f2a671a4a25c6 WatchSource:0}: Error finding container 6bdac39397122fad20ca5aacc54e288f085d4e9bc35b8915ae9f2a671a4a25c6: Status 404 returned error can't find the container with id 6bdac39397122fad20ca5aacc54e288f085d4e9bc35b8915ae9f2a671a4a25c6
Apr 16 18:25:33.162454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162552 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162477 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162619 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162669 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162669 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162770 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fmx4\" (UniqueName: \"kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162824 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162876 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162853 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.162981 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.162961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.163070 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.163050 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.165317 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.165290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") "
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.171097 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.171077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fmx4\" (UniqueName: \"kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.219026 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.218970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerStarted","Data":"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df"}
Apr 16 18:25:33.219026 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.219003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerStarted","Data":"6bdac39397122fad20ca5aacc54e288f085d4e9bc35b8915ae9f2a671a4a25c6"}
Apr 16 18:25:33.281702 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.281679 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:33.411750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:33.411670 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"]
Apr 16 18:25:33.414739 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:25:33.414710 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa94d68_45f0_44ce_ae6e_a7264eee0ad5.slice/crio-25f4d4749b06403e4ff2aa7e9a52a022af6223ed1a36c190b0e16c45a460248a WatchSource:0}: Error finding container 25f4d4749b06403e4ff2aa7e9a52a022af6223ed1a36c190b0e16c45a460248a: Status 404 returned error can't find the container with id 25f4d4749b06403e4ff2aa7e9a52a022af6223ed1a36c190b0e16c45a460248a
Apr 16 18:25:34.225029 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:34.224995 2567 generic.go:358] "Generic (PLEG): container finished" podID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerID="cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe" exitCode=0
Apr 16 18:25:34.225478 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:34.225079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerDied","Data":"cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe"}
Apr 16 18:25:34.225478 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:34.225127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerStarted","Data":"25f4d4749b06403e4ff2aa7e9a52a022af6223ed1a36c190b0e16c45a460248a"}
Apr 16 18:25:35.230965 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:35.230927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerStarted","Data":"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c"}
Apr 16 18:25:35.230965 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:35.230970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerStarted","Data":"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348"}
Apr 16 18:25:35.231371 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:35.231074 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"
Apr 16 18:25:35.251169 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:35.251122 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" podStartSLOduration=3.251104645 podStartE2EDuration="3.251104645s" podCreationTimestamp="2026-04-16 18:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:25:35.25006359 +0000 UTC m=+954.454415438" watchObservedRunningTime="2026-04-16 18:25:35.251104645 +0000 UTC m=+954.455456473"
Apr 16 18:25:38.242294 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:38.242253 2567 generic.go:358] "Generic (PLEG): container finished" podID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerID="43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df" exitCode=0
Apr 16 18:25:38.242709 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:38.242325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerDied","Data":"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df"}
Apr 16 18:25:38.243363 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:38.243349 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:25:40.251893 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:40.251858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerStarted","Data":"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1"}
Apr 16 18:25:40.272706 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:40.272655 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" podStartSLOduration=7.103812089 podStartE2EDuration="8.272642316s" podCreationTimestamp="2026-04-16 18:25:32 +0000 UTC" firstStartedPulling="2026-04-16 18:25:38.243468103 +0000 UTC m=+957.447819907" lastFinishedPulling="2026-04-16 18:25:39.412298316 +0000 UTC m=+958.616650134" observedRunningTime="2026-04-16 18:25:40.270910396 +0000 UTC m=+959.475262235" watchObservedRunningTime="2026-04-16 18:25:40.272642316 +0000 UTC m=+959.476994141"
Apr 16 18:25:43.015656 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.015620 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:43.015999 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.015667 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"
Apr 16 18:25:43.027980 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.027956
2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" Apr 16 18:25:43.272675 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.272607 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" Apr 16 18:25:43.282703 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.282681 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:25:43.282703 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.282707 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:25:43.285301 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:43.285278 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:25:44.265341 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:25:44.265316 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:26:05.268934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:05.268860 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:26:24.768017 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.767988 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"] Apr 16 18:26:24.771582 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.771547 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.773796 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.773778 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-b2q54\"" Apr 16 18:26:24.773870 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.773857 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 18:26:24.783501 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.783481 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"] Apr 16 18:26:24.806837 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.806806 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.806957 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.806851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qnpt\" (UniqueName: \"kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.806957 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.806928 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.807048 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.806995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.807048 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.807016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.807121 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.807050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908121 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:26:24.908086 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908121 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qnpt\" (UniqueName: \"kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908292 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908292 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908292 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908261 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908427 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908668 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908754 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908728 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908754 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.908860 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.908805 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.910727 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.910703 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:24.920303 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:24.920281 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qnpt\" (UniqueName: \"kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:25.080662 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:25.080568 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:25.206263 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:25.206231 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"] Apr 16 18:26:25.209940 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:26:25.209915 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f97a7a1_5aaa_4c8d_ab07_d27149817a4e.slice/crio-7e2176b3bfe021ed67c2741a78fa73168bc0cd36d20a363f87e2a7e2bc78d3a2 WatchSource:0}: Error finding container 7e2176b3bfe021ed67c2741a78fa73168bc0cd36d20a363f87e2a7e2bc78d3a2: Status 404 returned error can't find the container with id 7e2176b3bfe021ed67c2741a78fa73168bc0cd36d20a363f87e2a7e2bc78d3a2 Apr 16 18:26:25.394062 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:25.393983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerStarted","Data":"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6"} Apr 16 18:26:25.394062 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:25.394018 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerStarted","Data":"7e2176b3bfe021ed67c2741a78fa73168bc0cd36d20a363f87e2a7e2bc78d3a2"} Apr 16 18:26:26.399139 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.399105 2567 generic.go:358] "Generic (PLEG): container finished" podID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerID="c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6" exitCode=0 Apr 16 18:26:26.399617 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:26:26.399196 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerDied","Data":"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6"} Apr 16 18:26:26.965097 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.965065 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"] Apr 16 18:26:26.965487 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.965465 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="main" containerID="cri-o://483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348" gracePeriod=30 Apr 16 18:26:26.965592 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.965511 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="tokenizer" containerID="cri-o://3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c" gracePeriod=30 Apr 16 18:26:26.966772 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.966749 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"] Apr 16 18:26:26.967012 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:26.966989 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="main" containerID="cri-o://b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1" gracePeriod=30 Apr 16 
18:26:27.226473 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.226413 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" Apr 16 18:26:27.328040 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328009 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location\") pod \"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328223 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328074 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home\") pod \"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328223 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328172 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6txs\" (UniqueName: \"kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs\") pod \"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328223 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328213 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs\") pod \"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328398 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328267 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm\") pod 
\"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328398 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328283 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home" (OuterVolumeSpecName: "home") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:27.328398 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328298 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache\") pod \"426804c4-e2c6-453d-85e3-f1aaef3128bd\" (UID: \"426804c4-e2c6-453d-85e3-f1aaef3128bd\") " Apr 16 18:26:27.328605 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328574 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-home\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.328830 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.328794 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache" (OuterVolumeSpecName: "model-cache") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:27.330319 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.330297 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm" (OuterVolumeSpecName: "dshm") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:27.330826 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.330804 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:27.330933 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.330831 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs" (OuterVolumeSpecName: "kube-api-access-t6txs") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). InnerVolumeSpecName "kube-api-access-t6txs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:27.393265 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.393214 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "426804c4-e2c6-453d-85e3-f1aaef3128bd" (UID: "426804c4-e2c6-453d-85e3-f1aaef3128bd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:27.403981 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.403945 2567 generic.go:358] "Generic (PLEG): container finished" podID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerID="b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1" exitCode=0 Apr 16 18:26:27.404409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.403993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerDied","Data":"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1"} Apr 16 18:26:27.404409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.404026 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" Apr 16 18:26:27.404409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.404053 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959" event={"ID":"426804c4-e2c6-453d-85e3-f1aaef3128bd","Type":"ContainerDied","Data":"6bdac39397122fad20ca5aacc54e288f085d4e9bc35b8915ae9f2a671a4a25c6"} Apr 16 18:26:27.404409 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.404075 2567 scope.go:117] "RemoveContainer" containerID="b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1" Apr 16 18:26:27.406757 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.406695 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerStarted","Data":"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005"} Apr 16 18:26:27.406757 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.406730 2567 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerStarted","Data":"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167"} Apr 16 18:26:27.406946 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.406913 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:27.408968 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.408946 2567 generic.go:358] "Generic (PLEG): container finished" podID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerID="483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348" exitCode=0 Apr 16 18:26:27.409097 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.409008 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerDied","Data":"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348"} Apr 16 18:26:27.415682 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.415532 2567 scope.go:117] "RemoveContainer" containerID="43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df" Apr 16 18:26:27.426952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.426932 2567 scope.go:117] "RemoveContainer" containerID="b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1" Apr 16 18:26:27.427227 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:27.427202 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1\": container with ID starting with b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1 not found: ID does not exist" 
containerID="b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1" Apr 16 18:26:27.427286 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.427239 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1"} err="failed to get container status \"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1\": rpc error: code = NotFound desc = could not find container \"b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1\": container with ID starting with b9edda2815bff264e5a29cb957237316fb3ed681339594799bfccd0003f611c1 not found: ID does not exist" Apr 16 18:26:27.427286 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.427264 2567 scope.go:117] "RemoveContainer" containerID="43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df" Apr 16 18:26:27.427516 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:27.427494 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df\": container with ID starting with 43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df not found: ID does not exist" containerID="43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df" Apr 16 18:26:27.427588 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.427523 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df"} err="failed to get container status \"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df\": rpc error: code = NotFound desc = could not find container \"43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df\": container with ID starting with 43566d90bdca4030bbe27b91a3043dd808aaf51c641f4008476ac1198e02d8df not found: ID does not exist" Apr 16 
18:26:27.432163 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432133 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6txs\" (UniqueName: \"kubernetes.io/projected/426804c4-e2c6-453d-85e3-f1aaef3128bd-kube-api-access-t6txs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.432273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432177 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426804c4-e2c6-453d-85e3-f1aaef3128bd-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.432273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432194 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-dshm\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.432273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432208 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-model-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.432273 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432231 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426804c4-e2c6-453d-85e3-f1aaef3128bd-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:27.432985 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.432829 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" podStartSLOduration=3.432810718 podStartE2EDuration="3.432810718s" podCreationTimestamp="2026-04-16 18:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:27.428588201 +0000 UTC m=+1006.632940028" watchObservedRunningTime="2026-04-16 18:26:27.432810718 +0000 UTC m=+1006.637162550" Apr 16 18:26:27.445697 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.445665 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"] Apr 16 18:26:27.451225 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:27.451202 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-769474c8cf-rm959"] Apr 16 18:26:28.225860 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.225839 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:26:28.339609 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339580 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.339750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339689 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.339750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339741 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fmx4\" (UniqueName: \"kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: 
\"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.339869 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339772 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.339869 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339802 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.339869 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339835 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds\") pod \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\" (UID: \"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5\") " Apr 16 18:26:28.340037 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.339920 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:28.340115 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.340091 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:28.340163 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.340118 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.340210 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.340143 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:28.340314 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.340293 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:28.341877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.341851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:28.341877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.341874 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4" (OuterVolumeSpecName: "kube-api-access-8fmx4") pod "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" (UID: "0aa94d68-45f0-44ce-ae6e-a7264eee0ad5"). InnerVolumeSpecName "kube-api-access-8fmx4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:28.414674 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.414646 2567 generic.go:358] "Generic (PLEG): container finished" podID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerID="3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c" exitCode=0 Apr 16 18:26:28.415067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.414742 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" Apr 16 18:26:28.415067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.414755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerDied","Data":"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c"} Apr 16 18:26:28.415067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.414790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d" event={"ID":"0aa94d68-45f0-44ce-ae6e-a7264eee0ad5","Type":"ContainerDied","Data":"25f4d4749b06403e4ff2aa7e9a52a022af6223ed1a36c190b0e16c45a460248a"} Apr 16 18:26:28.415067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.414808 2567 scope.go:117] "RemoveContainer" 
containerID="3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c" Apr 16 18:26:28.423095 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.423077 2567 scope.go:117] "RemoveContainer" containerID="483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348" Apr 16 18:26:28.430398 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.430380 2567 scope.go:117] "RemoveContainer" containerID="cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe" Apr 16 18:26:28.437357 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.437337 2567 scope.go:117] "RemoveContainer" containerID="3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c" Apr 16 18:26:28.437620 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:28.437602 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c\": container with ID starting with 3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c not found: ID does not exist" containerID="3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c" Apr 16 18:26:28.437681 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.437628 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c"} err="failed to get container status \"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c\": rpc error: code = NotFound desc = could not find container \"3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c\": container with ID starting with 3547ce7439f2f6d427f976c0d39cf3130ce35e4fb0f2487b358de2a59605080c not found: ID does not exist" Apr 16 18:26:28.437681 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.437646 2567 scope.go:117] "RemoveContainer" containerID="483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348" Apr 16 18:26:28.437848 
ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:28.437830 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348\": container with ID starting with 483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348 not found: ID does not exist" containerID="483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348" Apr 16 18:26:28.437904 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.437856 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348"} err="failed to get container status \"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348\": rpc error: code = NotFound desc = could not find container \"483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348\": container with ID starting with 483741b0b1d2865d41a908426ddc550e0d814f18e6720e94d3ae214bcbed1348 not found: ID does not exist" Apr 16 18:26:28.437904 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.437878 2567 scope.go:117] "RemoveContainer" containerID="cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe" Apr 16 18:26:28.438119 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:28.438104 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe\": container with ID starting with cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe not found: ID does not exist" containerID="cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe" Apr 16 18:26:28.438156 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.438123 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe"} err="failed to get container status \"cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe\": rpc error: code = NotFound desc = could not find container \"cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe\": container with ID starting with cb1f7ed084a2a9ecd0c32274976163b2d5e5d2cc861c614d307c7f192c3b5afe not found: ID does not exist" Apr 16 18:26:28.441014 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.440991 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"] Apr 16 18:26:28.441471 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.441448 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.441518 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.441481 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8fmx4\" (UniqueName: \"kubernetes.io/projected/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-kube-api-access-8fmx4\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.441518 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.441497 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.441518 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.441511 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.441649 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:26:28.441522 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:28.446139 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:28.446117 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-68f6b96gvv4d"] Apr 16 18:26:29.310532 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:29.310501 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" path="/var/lib/kubelet/pods/0aa94d68-45f0-44ce-ae6e-a7264eee0ad5/volumes" Apr 16 18:26:29.310982 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:29.310967 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" path="/var/lib/kubelet/pods/426804c4-e2c6-453d-85e3-f1aaef3128bd/volumes" Apr 16 18:26:35.080822 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:35.080792 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:35.080822 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:35.080830 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:35.082033 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:26:35.082004 2567 logging.go:55] [core] [Channel #65 SubChannel #66]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.27:9003", ServerName: "10.134.0.27:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.27:9003: connect: connection refused" Apr 16 18:26:35.083296 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:35.083275 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:35.438549 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:35.438475 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:36.081199 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:36.081160 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.27:9003\" within 1s: context deadline exceeded" Apr 16 18:26:45.081827 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:26:45.081801 2567 logging.go:55] [core] [Channel #67 SubChannel #68]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.27:9003", ServerName: "10.134.0.27:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.27:9003: connect: connection refused" Apr 16 18:26:46.082533 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:46.082490 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.27:9003\" within 1s: context deadline exceeded" Apr 16 18:26:56.441519 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:56.441481 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:57.579686 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:57.579654 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"] Apr 16 18:26:57.580130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:57.579961 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main" containerID="cri-o://cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167" gracePeriod=30 Apr 16 18:26:57.580130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:57.580029 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="tokenizer" containerID="cri-o://ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005" gracePeriod=30 Apr 16 18:26:58.515395 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:58.515362 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerID="cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167" exitCode=0 Apr 16 18:26:58.515613 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:58.515427 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerDied","Data":"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167"} Apr 16 18:26:58.928248 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:58.928227 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:59.091597 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091575 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091602 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091622 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091640 2567 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qnpt\" (UniqueName: \"kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091716 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091694 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091932 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091723 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs\") pod \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\" (UID: \"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e\") " Apr 16 18:26:59.091988 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.091940 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:59.092061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.092033 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:59.092061 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.092047 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:59.092335 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.092314 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:59.093784 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.093755 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:59.093898 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.093856 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt" (OuterVolumeSpecName: "kube-api-access-6qnpt") pod "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" (UID: "7f97a7a1-5aaa-4c8d-ab07-d27149817a4e"). InnerVolumeSpecName "kube-api-access-6qnpt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:59.192426 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192400 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.192514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192431 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.192514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192442 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.192514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192451 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.192514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192461 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.192514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.192470 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qnpt\" (UniqueName: \"kubernetes.io/projected/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e-kube-api-access-6qnpt\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.519989 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:26:59.519923 2567 generic.go:358] "Generic (PLEG): container finished" podID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerID="ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005" exitCode=0 Apr 16 18:26:59.520086 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.520013 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" Apr 16 18:26:59.520135 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.520010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerDied","Data":"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005"} Apr 16 18:26:59.520135 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.520112 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn" event={"ID":"7f97a7a1-5aaa-4c8d-ab07-d27149817a4e","Type":"ContainerDied","Data":"7e2176b3bfe021ed67c2741a78fa73168bc0cd36d20a363f87e2a7e2bc78d3a2"} Apr 16 18:26:59.520135 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.520129 2567 scope.go:117] "RemoveContainer" containerID="ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005" Apr 16 18:26:59.527502 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.527483 2567 scope.go:117] "RemoveContainer" containerID="cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167" Apr 16 18:26:59.534085 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.534068 2567 scope.go:117] "RemoveContainer" containerID="c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6" Apr 16 18:26:59.538424 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.538400 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"] Apr 16 18:26:59.544513 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.544492 2567 scope.go:117] "RemoveContainer" containerID="ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005" Apr 16 18:26:59.544781 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:59.544764 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005\": container with ID starting with ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005 not found: ID does not exist" containerID="ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005" Apr 16 18:26:59.544845 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.544789 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005"} err="failed to get container status \"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005\": rpc error: code = NotFound desc = could not find container \"ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005\": container with ID starting with ec4f569a437e7a366666ae0b61c61f2b7e9d7c16f7f95501fa13f43067fd1005 not found: ID does not exist" Apr 16 18:26:59.544845 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.544805 2567 scope.go:117] "RemoveContainer" containerID="cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167" Apr 16 18:26:59.545037 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:59.545017 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167\": container with ID starting with cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167 not found: ID does not exist" 
containerID="cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167"
Apr 16 18:26:59.545102 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.545047 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167"} err="failed to get container status \"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167\": rpc error: code = NotFound desc = could not find container \"cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167\": container with ID starting with cc64c16d54c919c7dfe9c0dcb8a30e97246b59d3b9463f92c7d4d4c224668167 not found: ID does not exist"
Apr 16 18:26:59.545102 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.545070 2567 scope.go:117] "RemoveContainer" containerID="c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6"
Apr 16 18:26:59.545294 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.545276 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76c6b744bhsdn"]
Apr 16 18:26:59.545347 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:26:59.545297 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6\": container with ID starting with c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6 not found: ID does not exist" containerID="c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6"
Apr 16 18:26:59.545347 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:26:59.545319 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6"} err="failed to get container status \"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6\": rpc error: code = NotFound desc = could not find container \"c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6\": container with ID starting with c9daa660ebb980450a33907d895d0596cf4b69c030eb45540b6ef4596b1d83e6 not found: ID does not exist"
Apr 16 18:27:01.310862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:01.310827 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" path="/var/lib/kubelet/pods/7f97a7a1-5aaa-4c8d-ab07-d27149817a4e/volumes"
Apr 16 18:27:38.556047 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.555974 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556318 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="storage-initializer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556331 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="storage-initializer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556345 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="storage-initializer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556350 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="storage-initializer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556359 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556364 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556371 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556377 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556384 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="tokenizer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556389 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="tokenizer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556394 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556399 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="main"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556404 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="tokenizer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556409 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="tokenizer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556422 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="storage-initializer"
Apr 16 18:27:38.556449 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556427 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="storage-initializer"
Apr 16 18:27:38.557067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556484 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="tokenizer"
Apr 16 18:27:38.557067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556493 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="426804c4-e2c6-453d-85e3-f1aaef3128bd" containerName="main"
Apr 16 18:27:38.557067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556499 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="main"
Apr 16 18:27:38.557067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556506 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f97a7a1-5aaa-4c8d-ab07-d27149817a4e" containerName="tokenizer"
Apr 16 18:27:38.557067 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.556512 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aa94d68-45f0-44ce-ae6e-a7264eee0ad5" containerName="main"
Apr 16 18:27:38.559825 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.559808 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.562245 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.562223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-7hhcz\""
Apr 16 18:27:38.563084 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.563061 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:27:38.563201 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.563070 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 18:27:38.570859 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.570839 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:27:38.673641 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.673751 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.673751 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.673841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.673841 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz24d\" (UniqueName: \"kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.673921 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.673841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774329 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774425 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774425 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774425 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774530 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774604 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774529 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz24d\" (UniqueName: \"kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774794 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774876 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774876 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774847 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.774955 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.774879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.776951 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.776930 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.783216 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.783185 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz24d\" (UniqueName: \"kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.869237 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.869218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:38.995204 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:38.995177 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:27:38.997445 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:27:38.997419 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2792a5e7_b3ac_47cc_b0e0_b0c1f286f81a.slice/crio-2a837501c4f6d337eb412c9a36adfc782f223a355734a8a9a6a97ed22346b8d0 WatchSource:0}: Error finding container 2a837501c4f6d337eb412c9a36adfc782f223a355734a8a9a6a97ed22346b8d0: Status 404 returned error can't find the container with id 2a837501c4f6d337eb412c9a36adfc782f223a355734a8a9a6a97ed22346b8d0
Apr 16 18:27:39.651715 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:39.651676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerStarted","Data":"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"}
Apr 16 18:27:39.651715 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:39.651716 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerStarted","Data":"2a837501c4f6d337eb412c9a36adfc782f223a355734a8a9a6a97ed22346b8d0"}
Apr 16 18:27:40.656395 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:40.656367 2567 generic.go:358] "Generic (PLEG): container finished" podID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerID="dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503" exitCode=0
Apr 16 18:27:40.656849 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:40.656401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerDied","Data":"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"}
Apr 16 18:27:41.661458 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:41.661423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerStarted","Data":"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"}
Apr 16 18:27:41.661458 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:41.661460 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerStarted","Data":"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"}
Apr 16 18:27:41.662009 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:41.661599 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:41.683119 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:41.683055 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" podStartSLOduration=3.683039175 podStartE2EDuration="3.683039175s" podCreationTimestamp="2026-04-16 18:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:41.681644207 +0000 UTC m=+1080.885996035" watchObservedRunningTime="2026-04-16 18:27:41.683039175 +0000 UTC m=+1080.887391000"
Apr 16 18:27:48.869776 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:48.869739 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:48.870343 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:48.869790 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:48.872384 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:48.872354 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:27:49.689942 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:27:49.689915 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:28:10.693522 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:28:10.693491 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:29:50.859261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:50.859223 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:29:50.859786 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:50.859663 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="main" containerID="cri-o://36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2" gracePeriod=30
Apr 16 18:29:50.859786 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:50.859700 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="tokenizer" containerID="cri-o://24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed" gracePeriod=30
Apr 16 18:29:51.098007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:51.097970 2567 generic.go:358] "Generic (PLEG): container finished" podID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerID="36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2" exitCode=0
Apr 16 18:29:51.098201 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:51.098043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerDied","Data":"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"}
Apr 16 18:29:52.003248 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.003227 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:29:52.102375 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.102352 2567 generic.go:358] "Generic (PLEG): container finished" podID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerID="24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed" exitCode=0
Apr 16 18:29:52.102510 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.102474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerDied","Data":"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"}
Apr 16 18:29:52.102510 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.102484 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"
Apr 16 18:29:52.102664 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.102512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm" event={"ID":"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a","Type":"ContainerDied","Data":"2a837501c4f6d337eb412c9a36adfc782f223a355734a8a9a6a97ed22346b8d0"}
Apr 16 18:29:52.102664 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.102534 2567 scope.go:117] "RemoveContainer" containerID="24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"
Apr 16 18:29:52.110182 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.110165 2567 scope.go:117] "RemoveContainer" containerID="36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"
Apr 16 18:29:52.117086 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.117071 2567 scope.go:117] "RemoveContainer" containerID="dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"
Apr 16 18:29:52.123955 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.123939 2567 scope.go:117] "RemoveContainer" containerID="24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"
Apr 16 18:29:52.124199 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:29:52.124182 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed\": container with ID starting with 24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed not found: ID does not exist" containerID="24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"
Apr 16 18:29:52.124249 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.124206 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed"} err="failed to get container status \"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed\": rpc error: code = NotFound desc = could not find container \"24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed\": container with ID starting with 24cf60aeda562e4f2bd4d8d62a1b8800c58cec77c93c3f98ce49b6e9c41601ed not found: ID does not exist"
Apr 16 18:29:52.124249 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.124225 2567 scope.go:117] "RemoveContainer" containerID="36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"
Apr 16 18:29:52.124478 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:29:52.124435 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2\": container with ID starting with 36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2 not found: ID does not exist" containerID="36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"
Apr 16 18:29:52.124520 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.124485 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2"} err="failed to get container status \"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2\": rpc error: code = NotFound desc = could not find container \"36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2\": container with ID starting with 36572c1e47dcde2e96dfad7d3c93490df197ef131261bc4ef8d503b9d6c978d2 not found: ID does not exist"
Apr 16 18:29:52.124520 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.124504 2567 scope.go:117] "RemoveContainer" containerID="dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"
Apr 16 18:29:52.124733 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:29:52.124717 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503\": container with ID starting with dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503 not found: ID does not exist" containerID="dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"
Apr 16 18:29:52.124781 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.124738 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503"} err="failed to get container status \"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503\": rpc error: code = NotFound desc = could not find container \"dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503\": container with ID starting with dd850064475b9fba37be65f8a0d6bb3c0b8f1a448e5b71fef7e1869c8cb56503 not found: ID does not exist"
Apr 16 18:29:52.146412 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146393 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146485 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146431 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146485 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146463 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146485 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146483 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146621 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146510 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146621 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146612 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz24d\" (UniqueName: \"kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d\") pod \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\" (UID: \"2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a\") "
Apr 16 18:29:52.146751 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146723 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:29:52.146831 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146734 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:29:52.146831 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146817 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:29:52.146941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146918 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.146941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146930 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.146941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.146938 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.147214 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.147190 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:29:52.148377 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.148352 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:29:52.148622 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.148599 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d" (OuterVolumeSpecName: "kube-api-access-xz24d") pod "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" (UID: "2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a"). InnerVolumeSpecName "kube-api-access-xz24d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:29:52.247816 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.247797 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xz24d\" (UniqueName: \"kubernetes.io/projected/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kube-api-access-xz24d\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.247816 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.247818 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.247939 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.247828 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:29:52.438466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.438443 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:29:52.470665 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:52.470644 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-d2bpm"]
Apr 16 18:29:53.310783 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:29:53.310752 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" path="/var/lib/kubelet/pods/2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a/volumes"
Apr 16 18:30:09.597054 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597020 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"]
Apr 16 18:30:09.597475 ip-10-0-141-189
kubenswrapper[2567]: I0416 18:30:09.597366 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="storage-initializer" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597378 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="storage-initializer" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597388 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="tokenizer" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597393 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="tokenizer" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597406 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="main" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597411 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="main" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597461 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="tokenizer" Apr 16 18:30:09.597475 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.597471 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2792a5e7-b3ac-47cc-b0e0-b0c1f286f81a" containerName="main" Apr 16 18:30:09.600503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.600484 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.603543 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.603521 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-gv9h6\"" Apr 16 18:30:09.603655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.603528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:30:09.603655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.603575 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 18:30:09.611311 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.611290 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"] Apr 16 18:30:09.679506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.679629 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 
18:30:09.679629 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.679629 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.679629 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.679769 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.679652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2f5\" (UniqueName: \"kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.780830 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.780929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780843 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.780929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.780929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" 
Apr 16 18:30:09.781059 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780938 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.781059 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.780977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2f5\" (UniqueName: \"kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.781245 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.781226 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.781315 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.781292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.781420 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.781386 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.781514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.781488 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.783211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.783195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.792190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.792166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2f5\" (UniqueName: \"kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5\") pod \"stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:09.910831 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:09.910763 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:10.037002 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:10.036972 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"] Apr 16 18:30:10.039021 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:30:10.038994 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe4e303_f8ad_4b2d_a3d1_1c5482f577a6.slice/crio-ef6882a6fe1ab5b249172d233062e58ee988e892e4b11ab75a5cb37b8d2f4824 WatchSource:0}: Error finding container ef6882a6fe1ab5b249172d233062e58ee988e892e4b11ab75a5cb37b8d2f4824: Status 404 returned error can't find the container with id ef6882a6fe1ab5b249172d233062e58ee988e892e4b11ab75a5cb37b8d2f4824 Apr 16 18:30:10.165955 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:10.165883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerStarted","Data":"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"} Apr 16 18:30:10.165955 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:10.165918 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerStarted","Data":"ef6882a6fe1ab5b249172d233062e58ee988e892e4b11ab75a5cb37b8d2f4824"} Apr 16 18:30:11.169965 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:11.169937 2567 generic.go:358] "Generic (PLEG): container finished" podID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerID="14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681" exitCode=0 Apr 16 18:30:11.170303 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:11.170005 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerDied","Data":"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"} Apr 16 18:30:12.175136 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:12.175097 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerStarted","Data":"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"} Apr 16 18:30:12.175136 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:12.175132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerStarted","Data":"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"} Apr 16 18:30:12.175540 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:12.175236 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:12.203654 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:12.203602 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" podStartSLOduration=3.203584269 podStartE2EDuration="3.203584269s" podCreationTimestamp="2026-04-16 18:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:12.200882857 +0000 UTC m=+1231.405234687" watchObservedRunningTime="2026-04-16 18:30:12.203584269 +0000 UTC m=+1231.407936096" Apr 16 18:30:19.910902 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:19.910866 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:19.910902 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:19.910910 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:19.913543 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:19.913520 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:20.201505 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:20.201428 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:30:41.206092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:30:41.206021 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:32:01.206442 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.206353 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:32:01.250174 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.250139 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"] Apr 16 18:32:01.251816 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.251779 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" 
containerName="main" containerID="cri-o://c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8" gracePeriod=30 Apr 16 18:32:01.252862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.252473 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="tokenizer" containerID="cri-o://38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138" gracePeriod=30 Apr 16 18:32:01.535170 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.535087 2567 generic.go:358] "Generic (PLEG): container finished" podID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerID="c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8" exitCode=0 Apr 16 18:32:01.535312 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:01.535167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerDied","Data":"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"} Apr 16 18:32:02.402864 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.402840 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" Apr 16 18:32:02.535698 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535628 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.535698 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535691 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.535866 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535721 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2f5\" (UniqueName: \"kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.535866 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535785 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.535866 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535804 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.535866 
ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535827 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location\") pod \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\" (UID: \"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6\") " Apr 16 18:32:02.536077 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.535931 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:02.536160 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.536106 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:02.536160 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.536142 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:32:02.536289 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.536156 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:02.536641 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.536619 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:02.538282 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.538255 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:02.538382 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.538276 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5" (OuterVolumeSpecName: "kube-api-access-kh2f5") pod "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" (UID: "1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6"). InnerVolumeSpecName "kube-api-access-kh2f5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:02.542902 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.542880 2567 generic.go:358] "Generic (PLEG): container finished" podID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerID="38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138" exitCode=0 Apr 16 18:32:02.542985 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.542931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerDied","Data":"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"} Apr 16 18:32:02.542985 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.542964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s" event={"ID":"1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6","Type":"ContainerDied","Data":"ef6882a6fe1ab5b249172d233062e58ee988e892e4b11ab75a5cb37b8d2f4824"} Apr 16 18:32:02.542985 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.542968 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"
Apr 16 18:32:02.542985 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.542981 2567 scope.go:117] "RemoveContainer" containerID="38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"
Apr 16 18:32:02.557602 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.557586 2567 scope.go:117] "RemoveContainer" containerID="c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"
Apr 16 18:32:02.564263 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.564247 2567 scope.go:117] "RemoveContainer" containerID="14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"
Apr 16 18:32:02.570418 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.570396 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"]
Apr 16 18:32:02.571672 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.571654 2567 scope.go:117] "RemoveContainer" containerID="38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"
Apr 16 18:32:02.571914 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:32:02.571895 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138\": container with ID starting with 38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138 not found: ID does not exist" containerID="38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"
Apr 16 18:32:02.571977 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.571922 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138"} err="failed to get container status \"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138\": rpc error: code = NotFound desc = could not find container \"38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138\": container with ID starting with 38f9e28deb084b24a2e36e7e44a6b32e704cbc4a9b414236eceb546df4f46138 not found: ID does not exist"
Apr 16 18:32:02.571977 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.571940 2567 scope.go:117] "RemoveContainer" containerID="c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"
Apr 16 18:32:02.572140 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:32:02.572123 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8\": container with ID starting with c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8 not found: ID does not exist" containerID="c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"
Apr 16 18:32:02.572179 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.572146 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8"} err="failed to get container status \"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8\": rpc error: code = NotFound desc = could not find container \"c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8\": container with ID starting with c75a655508964ff18821230288fc49dcb83287bc75710dbe8a59367193839bc8 not found: ID does not exist"
Apr 16 18:32:02.572179 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.572160 2567 scope.go:117] "RemoveContainer" containerID="14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"
Apr 16 18:32:02.572371 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:32:02.572356 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681\": container with ID starting with 14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681 not found: ID does not exist" containerID="14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"
Apr 16 18:32:02.572422 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.572375 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681"} err="failed to get container status \"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681\": rpc error: code = NotFound desc = could not find container \"14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681\": container with ID starting with 14934e54d2d6f8fe9a3e9cc5a03df0c400559e6299be17794607ab8fe3a55681 not found: ID does not exist"
Apr 16 18:32:02.575614 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.575595 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6459b6cc4b-sws5s"]
Apr 16 18:32:02.637541 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.637515 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:32:02.637648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.637540 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kh2f5\" (UniqueName: \"kubernetes.io/projected/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kube-api-access-kh2f5\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:32:02.637648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.637580 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:32:02.637648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.637594 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:32:02.637648 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.637609 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 16 18:32:02.747768 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.747740 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"]
Apr 16 18:32:02.748079 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748065 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="tokenizer"
Apr 16 18:32:02.748126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748081 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="tokenizer"
Apr 16 18:32:02.748126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748092 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="storage-initializer"
Apr 16 18:32:02.748126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748097 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="storage-initializer"
Apr 16 18:32:02.748126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748124 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="main"
Apr 16 18:32:02.748251 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748130 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="main"
Apr 16 18:32:02.748251 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748178 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="main"
Apr 16 18:32:02.748251 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.748188 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" containerName="tokenizer"
Apr 16 18:32:02.752503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.752485 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.757702 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.757655 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 18:32:02.757702 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.757678 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:32:02.757837 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.757820 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:32:02.759349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.759203 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dnphx\""
Apr 16 18:32:02.761148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.761127 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"]
Apr 16 18:32:02.839596 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.839573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxp6m\" (UniqueName: \"kubernetes.io/projected/7f3eb740-370e-4df5-a971-d328bf1a09c4-kube-api-access-pxp6m\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.839695 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.839629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f3eb740-370e-4df5-a971-d328bf1a09c4-cert\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.940826 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.940796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f3eb740-370e-4df5-a971-d328bf1a09c4-cert\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.940934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.940852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxp6m\" (UniqueName: \"kubernetes.io/projected/7f3eb740-370e-4df5-a971-d328bf1a09c4-kube-api-access-pxp6m\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.943155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.943132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f3eb740-370e-4df5-a971-d328bf1a09c4-cert\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:02.960097 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:02.960070 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxp6m\" (UniqueName: \"kubernetes.io/projected/7f3eb740-370e-4df5-a971-d328bf1a09c4-kube-api-access-pxp6m\") pod \"llmisvc-controller-manager-6c5b6b4855-bsvqv\" (UID: \"7f3eb740-370e-4df5-a971-d328bf1a09c4\") " pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:03.062420 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:03.062395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:03.177131 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:03.177082 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"]
Apr 16 18:32:03.180763 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:32:03.180741 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7f3eb740_370e_4df5_a971_d328bf1a09c4.slice/crio-aff89ca8d4f769dab077b636f5a8d5993993b462a2b4ae670899ad984a4b2df2 WatchSource:0}: Error finding container aff89ca8d4f769dab077b636f5a8d5993993b462a2b4ae670899ad984a4b2df2: Status 404 returned error can't find the container with id aff89ca8d4f769dab077b636f5a8d5993993b462a2b4ae670899ad984a4b2df2
Apr 16 18:32:03.182174 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:03.182157 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:32:03.310438 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:03.310413 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6" path="/var/lib/kubelet/pods/1fe4e303-f8ad-4b2d-a3d1-1c5482f577a6/volumes"
Apr 16 18:32:03.547282 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:03.547198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv" event={"ID":"7f3eb740-370e-4df5-a971-d328bf1a09c4","Type":"ContainerStarted","Data":"aff89ca8d4f769dab077b636f5a8d5993993b462a2b4ae670899ad984a4b2df2"}
Apr 16 18:32:06.559806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:06.559775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv" event={"ID":"7f3eb740-370e-4df5-a971-d328bf1a09c4","Type":"ContainerStarted","Data":"39acf85d3c02bb0ac9ca16dd5ce7bf0fd0d5efd5d09b5cbe775d7a44ba23401a"}
Apr 16 18:32:06.560170 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:06.559871 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:32:06.577401 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:06.577352 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv" podStartSLOduration=1.403644438 podStartE2EDuration="4.577337968s" podCreationTimestamp="2026-04-16 18:32:02 +0000 UTC" firstStartedPulling="2026-04-16 18:32:03.182275694 +0000 UTC m=+1342.386627498" lastFinishedPulling="2026-04-16 18:32:06.355969216 +0000 UTC m=+1345.560321028" observedRunningTime="2026-04-16 18:32:06.576303979 +0000 UTC m=+1345.780655806" watchObservedRunningTime="2026-04-16 18:32:06.577337968 +0000 UTC m=+1345.781689841"
Apr 16 18:32:37.565693 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:32:37.565665 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6c5b6b4855-bsvqv"
Apr 16 18:37:07.645007 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.644971 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:37:07.648739 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.648713 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.651988 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.651963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 18:37:07.652157 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.652019 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:37:07.652157 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.652040 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-6czxt\""
Apr 16 18:37:07.660528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.660505 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:37:07.725228 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.725196 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"]
Apr 16 18:37:07.726620 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.726590 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.726760 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.726706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.726861 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.726830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.726990 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.726881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.727056 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.726991 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.727056 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.727026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwr54\" (UniqueName: \"kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.729076 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.729059 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.731938 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.731919 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-tmk7w\""
Apr 16 18:37:07.745033 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.745013 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"]
Apr 16 18:37:07.827953 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.827918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.827989 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwr54\" (UniqueName: \"kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828361 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828361 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828361 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828361 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhnsf\" (UniqueName: \"kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828361 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828352 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.828599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828712 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.828750 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.828696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.830282 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.830261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.830459 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.830443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.839811 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.839780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwr54\" (UniqueName: \"kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:07.929883 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhnsf\" (UniqueName: \"kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.929883 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930116 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930116 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930116 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930116 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.929979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930346 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.930320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930346 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.930339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930427 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.930382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.930427 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.930400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.932432 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.932402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.939356 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.939327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhnsf\" (UniqueName: \"kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:07.960130 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:07.960105 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:37:08.039301 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.038791 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:08.092851 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.092817 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:37:08.094898 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:37:08.094873 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15905fe6_01d9_43a8_8a88_9020b10729d0.slice/crio-19918820fe875f0a5002620e80b6c6e38d4d775c9d306879eac24971cecc9219 WatchSource:0}: Error finding container 19918820fe875f0a5002620e80b6c6e38d4d775c9d306879eac24971cecc9219: Status 404 returned error can't find the container with id 19918820fe875f0a5002620e80b6c6e38d4d775c9d306879eac24971cecc9219
Apr 16 18:37:08.097399 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.097027 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:37:08.172157 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.172130 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"]
Apr 16 18:37:08.174861 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:37:08.174837 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50a4326_bccb_45f4_a5aa_be480ba2a1a4.slice/crio-a459e59280c347cc3abb81853d68835625561e3a92094b9ec443b9624252009a WatchSource:0}: Error finding container a459e59280c347cc3abb81853d68835625561e3a92094b9ec443b9624252009a: Status 404 returned error can't find the container with id a459e59280c347cc3abb81853d68835625561e3a92094b9ec443b9624252009a
Apr 16 18:37:08.537516 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.537412 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerStarted","Data":"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a"}
Apr 16 18:37:08.537516 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.537461 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerStarted","Data":"a459e59280c347cc3abb81853d68835625561e3a92094b9ec443b9624252009a"}
Apr 16 18:37:08.538967 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.538903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerStarted","Data":"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776"}
Apr 16 18:37:08.538967 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:08.538935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerStarted","Data":"19918820fe875f0a5002620e80b6c6e38d4d775c9d306879eac24971cecc9219"}
Apr 16 18:37:09.544468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:09.544429 2567 generic.go:358] "Generic (PLEG): container finished" podID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerID="53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a" exitCode=0
Apr 16 18:37:09.545183 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:09.544542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerDied","Data":"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a"}
Apr 16 18:37:10.551606 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:10.551541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerStarted","Data":"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0"}
Apr 16 18:37:10.552062 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:10.551617 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"
Apr 16 18:37:10.552062 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:10.551634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerStarted","Data":"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce"}
Apr 16 18:37:10.578299 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:10.578251 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" podStartSLOduration=3.578232809 podStartE2EDuration="3.578232809s" podCreationTimestamp="2026-04-16 18:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:10.577868011 +0000 UTC m=+1649.782219838" watchObservedRunningTime="2026-04-16 18:37:10.578232809 +0000 UTC m=+1649.782584630"
Apr 16 18:37:12.561511 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:12.561425 2567 generic.go:358] "Generic (PLEG): container finished" podID="15905fe6-01d9-43a8-8a88-9020b10729d0"
containerID="10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776" exitCode=0 Apr 16 18:37:12.561908 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:12.561510 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerDied","Data":"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776"} Apr 16 18:37:18.039961 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:18.039922 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:37:18.041770 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:18.040965 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:37:18.048092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:18.042436 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.32:8082/healthz\": dial tcp 10.134.0.32:8082: connect: connection refused" Apr 16 18:37:28.041465 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:28.041428 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:37:28.042865 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:28.042838 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:37:40.674524 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:40.674492 2567 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerStarted","Data":"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111"} Apr 16 18:37:40.695874 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:40.695818 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.176334737 podStartE2EDuration="33.695800048s" podCreationTimestamp="2026-04-16 18:37:07 +0000 UTC" firstStartedPulling="2026-04-16 18:37:12.562915919 +0000 UTC m=+1651.767267727" lastFinishedPulling="2026-04-16 18:37:40.082381234 +0000 UTC m=+1679.286733038" observedRunningTime="2026-04-16 18:37:40.693755774 +0000 UTC m=+1679.898107601" watchObservedRunningTime="2026-04-16 18:37:40.695800048 +0000 UTC m=+1679.900151876" Apr 16 18:37:48.631577 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:37:48.631540 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:39:55.235777 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:55.235746 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"] Apr 16 18:39:55.236252 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:55.236062 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="main" containerID="cri-o://9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce" gracePeriod=30 Apr 16 18:39:55.236252 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:55.236134 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="tokenizer" containerID="cri-o://e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0" gracePeriod=30 Apr 16 18:39:56.144495 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.144457 2567 generic.go:358] "Generic (PLEG): container finished" podID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerID="9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce" exitCode=0 Apr 16 18:39:56.144689 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.144512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerDied","Data":"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce"} Apr 16 18:39:56.306679 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.306647 2567 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-6czxt\" not found" Apr 16 18:39:56.398248 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:56.398166 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:56.398248 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:56.398244 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs podName:15905fe6-01d9-43a8-8a88-9020b10729d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:39:56.898228782 +0000 UTC m=+1816.102580586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:56.488541 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.488519 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:39:56.600258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600220 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache\") pod \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600258 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600265 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs\") pod \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600286 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location\") pod \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600334 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds\") pod 
\"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600359 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhnsf\" (UniqueName: \"kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf\") pod \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600528 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600373 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp\") pod \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\" (UID: \"d50a4326-bccb-45f4-a5aa-be480ba2a1a4\") " Apr 16 18:39:56.600775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600548 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.600775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600637 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.600877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600814 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.600877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600829 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.600877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.600834 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.601228 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.601200 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:56.602600 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.602582 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf" (OuterVolumeSpecName: "kube-api-access-dhnsf") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "kube-api-access-dhnsf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:56.602817 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.602795 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d50a4326-bccb-45f4-a5aa-be480ba2a1a4" (UID: "d50a4326-bccb-45f4-a5aa-be480ba2a1a4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:56.701785 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.701696 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.701785 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.701733 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.701785 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:56.701750 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhnsf\" (UniqueName: \"kubernetes.io/projected/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-kube-api-access-dhnsf\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.701785 ip-10-0-141-189 kubenswrapper[2567]: 
I0416 18:39:56.701765 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d50a4326-bccb-45f4-a5aa-be480ba2a1a4-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:56.904766 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:56.904731 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:56.904944 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:56.904798 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs podName:15905fe6-01d9-43a8-8a88-9020b10729d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:39:57.904784944 +0000 UTC m=+1817.109136748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:57.149474 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.149437 2567 generic.go:358] "Generic (PLEG): container finished" podID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerID="e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0" exitCode=0 Apr 16 18:39:57.149674 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.149511 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" Apr 16 18:39:57.149674 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.149527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerDied","Data":"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0"} Apr 16 18:39:57.149674 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.149587 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9" event={"ID":"d50a4326-bccb-45f4-a5aa-be480ba2a1a4","Type":"ContainerDied","Data":"a459e59280c347cc3abb81853d68835625561e3a92094b9ec443b9624252009a"} Apr 16 18:39:57.149674 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.149606 2567 scope.go:117] "RemoveContainer" containerID="e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0" Apr 16 18:39:57.158411 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.158394 2567 scope.go:117] "RemoveContainer" containerID="9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce" Apr 16 18:39:57.165608 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.165592 2567 scope.go:117] "RemoveContainer" containerID="53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a" Apr 16 18:39:57.172244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172227 2567 scope.go:117] "RemoveContainer" containerID="e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0" Apr 16 18:39:57.172483 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:57.172465 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0\": container with ID starting with 
e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0 not found: ID does not exist" containerID="e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0" Apr 16 18:39:57.172531 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172493 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0"} err="failed to get container status \"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0\": rpc error: code = NotFound desc = could not find container \"e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0\": container with ID starting with e947a8eea3f475ee4b995058887b8f2cf00b25d2b1da3733a771b50988e912d0 not found: ID does not exist" Apr 16 18:39:57.172531 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172514 2567 scope.go:117] "RemoveContainer" containerID="9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce" Apr 16 18:39:57.172745 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:57.172726 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce\": container with ID starting with 9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce not found: ID does not exist" containerID="9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce" Apr 16 18:39:57.172792 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172752 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce"} err="failed to get container status \"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce\": rpc error: code = NotFound desc = could not find container \"9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce\": container with ID starting with 
9eb6db4137b516b6e37e9dffafbd1f76d00d54766fc93cd3f6d9414490e635ce not found: ID does not exist" Apr 16 18:39:57.172792 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172769 2567 scope.go:117] "RemoveContainer" containerID="53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a" Apr 16 18:39:57.172991 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:57.172973 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a\": container with ID starting with 53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a not found: ID does not exist" containerID="53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a" Apr 16 18:39:57.173031 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.172998 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a"} err="failed to get container status \"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a\": rpc error: code = NotFound desc = could not find container \"53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a\": container with ID starting with 53af7a07075eb42b087e35ddae54159adaf10e2b55dfc7791230f5254fbfab8a not found: ID does not exist" Apr 16 18:39:57.175713 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.175685 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"] Apr 16 18:39:57.180451 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.180422 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sched79b9"] Apr 16 18:39:57.222538 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.222507 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:39:57.222761 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.222740 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="main" containerID="cri-o://40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111" gracePeriod=30 Apr 16 18:39:57.310950 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:57.310923 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" path="/var/lib/kubelet/pods/d50a4326-bccb-45f4-a5aa-be480ba2a1a4/volumes" Apr 16 18:39:57.913488 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:57.913455 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:57.913657 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:57.913529 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs podName:15905fe6-01d9-43a8-8a88-9020b10729d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:39:59.913515758 +0000 UTC m=+1819.117867562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 16 18:39:58.056818 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.056795 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:39:58.114613 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114540 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114613 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114591 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114635 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114690 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114718 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114778 ip-10-0-141-189 kubenswrapper[2567]: I0416 
18:39:58.114763 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwr54\" (UniqueName: \"kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54\") pod \"15905fe6-01d9-43a8-8a88-9020b10729d0\" (UID: \"15905fe6-01d9-43a8-8a88-9020b10729d0\") " Apr 16 18:39:58.114954 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.114826 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache" (OuterVolumeSpecName: "model-cache") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:58.115080 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.115059 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-model-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.115160 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.115088 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home" (OuterVolumeSpecName: "home") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:58.116833 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.116810 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:58.116956 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.116889 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54" (OuterVolumeSpecName: "kube-api-access-wwr54") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "kube-api-access-wwr54". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:58.116956 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.116918 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm" (OuterVolumeSpecName: "dshm") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:58.154816 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.154786 2567 generic.go:358] "Generic (PLEG): container finished" podID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerID="40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111" exitCode=0 Apr 16 18:39:58.154974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.154836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerDied","Data":"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111"} Apr 16 18:39:58.154974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.154857 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:39:58.154974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.154857 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"15905fe6-01d9-43a8-8a88-9020b10729d0","Type":"ContainerDied","Data":"19918820fe875f0a5002620e80b6c6e38d4d775c9d306879eac24971cecc9219"} Apr 16 18:39:58.154974 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.154887 2567 scope.go:117] "RemoveContainer" containerID="40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111" Apr 16 18:39:58.171793 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.171766 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15905fe6-01d9-43a8-8a88-9020b10729d0" (UID: "15905fe6-01d9-43a8-8a88-9020b10729d0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:58.173149 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.173128 2567 scope.go:117] "RemoveContainer" containerID="10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776" Apr 16 18:39:58.216519 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.216492 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-dshm\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.216519 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.216517 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.216713 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.216528 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwr54\" (UniqueName: \"kubernetes.io/projected/15905fe6-01d9-43a8-8a88-9020b10729d0-kube-api-access-wwr54\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.216713 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.216540 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15905fe6-01d9-43a8-8a88-9020b10729d0-home\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.216713 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.216549 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15905fe6-01d9-43a8-8a88-9020b10729d0-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:39:58.232991 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.232973 2567 scope.go:117] "RemoveContainer" containerID="40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111" Apr 
16 18:39:58.233326 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:58.233295 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111\": container with ID starting with 40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111 not found: ID does not exist" containerID="40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111" Apr 16 18:39:58.233420 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.233326 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111"} err="failed to get container status \"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111\": rpc error: code = NotFound desc = could not find container \"40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111\": container with ID starting with 40e16e743b80b3875c1c540a3c7119c6f63377589f69595b31799746dec9a111 not found: ID does not exist" Apr 16 18:39:58.233420 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.233346 2567 scope.go:117] "RemoveContainer" containerID="10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776" Apr 16 18:39:58.233651 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:39:58.233615 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776\": container with ID starting with 10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776 not found: ID does not exist" containerID="10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776" Apr 16 18:39:58.233651 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.233641 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776"} err="failed to get container status \"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776\": rpc error: code = NotFound desc = could not find container \"10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776\": container with ID starting with 10c33214a931e9d15bd1a81817af3e89cdc1cb18b99d24bc304c2b6342a1d776 not found: ID does not exist" Apr 16 18:39:58.478429 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.478398 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:39:58.482803 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:58.482779 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:39:59.311126 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:39:59.311083 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" path="/var/lib/kubelet/pods/15905fe6-01d9-43a8-8a88-9020b10729d0/volumes" Apr 16 18:40:22.874917 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.874873 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875237 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="storage-initializer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875252 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="storage-initializer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875265 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="main" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875273 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="main" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875283 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="main" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875291 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="main" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875308 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="storage-initializer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875317 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="storage-initializer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875328 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="tokenizer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875335 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="tokenizer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875424 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="15905fe6-01d9-43a8-8a88-9020b10729d0" containerName="main" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875437 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" 
containerName="tokenizer" Apr 16 18:40:22.875503 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.875449 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d50a4326-bccb-45f4-a5aa-be480ba2a1a4" containerName="main" Apr 16 18:40:22.878473 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.878457 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:22.881426 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.881401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 18:40:22.881607 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.881484 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:40:22.887349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:22.887265 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:40:23.037729 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037699 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.037729 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25btm\" (UniqueName: \"kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: 
\"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.037933 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037753 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.037933 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.037933 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.037933 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.037892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.123331 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.123297 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:40:23.126740 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.126655 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.129011 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.128993 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-69chv\"" Apr 16 18:40:23.137010 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.136990 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:40:23.138280 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138387 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138387 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138348 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25btm\" (UniqueName: \"kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138678 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138831 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138796 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.138957 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.138896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.141265 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.141172 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.141567 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.141526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.152804 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.151771 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25btm\" (UniqueName: \"kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm\") pod \"scheduler-inline-config-test-kserve-7b6bc495cf-465xk\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.191160 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.191132 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:23.239791 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.239763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.239934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.239795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.239934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.239824 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" 
(UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.239934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.239844 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.240127 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.239940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.240127 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.240004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jhz\" (UniqueName: \"kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.314148 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.313942 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:40:23.316734 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:40:23.316707 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6195eefa_8be8_4477_bda2_6585d515ce75.slice/crio-4b9f65185b55ebe0130544a050d0645e903639a0f486986c113145e09e044a52 WatchSource:0}: Error finding container 4b9f65185b55ebe0130544a050d0645e903639a0f486986c113145e09e044a52: Status 404 returned error can't find the container with id 4b9f65185b55ebe0130544a050d0645e903639a0f486986c113145e09e044a52 Apr 16 18:40:23.341033 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jhz\" (UniqueName: \"kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341181 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341142 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: 
\"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341298 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341186 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341298 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341298 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341431 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341548 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.341608 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.341584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.343536 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.343515 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.349416 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.349399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jhz\" (UniqueName: \"kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.436356 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.436272 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:23.569477 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:23.569451 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:40:23.571516 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:40:23.571487 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda921433d_5350_4a29_9aeb_6401bcaf1513.slice/crio-b04a7453549201d5f6233a2a1c97640fe2d2931db3d7288ee6667d84d666f48e WatchSource:0}: Error finding container b04a7453549201d5f6233a2a1c97640fe2d2931db3d7288ee6667d84d666f48e: Status 404 returned error can't find the container with id b04a7453549201d5f6233a2a1c97640fe2d2931db3d7288ee6667d84d666f48e Apr 16 18:40:24.240894 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:24.240841 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerStarted","Data":"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219"} Apr 16 
18:40:24.240894 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:24.240905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerStarted","Data":"4b9f65185b55ebe0130544a050d0645e903639a0f486986c113145e09e044a52"} Apr 16 18:40:24.242217 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:24.242193 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerStarted","Data":"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96"} Apr 16 18:40:24.242217 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:24.242219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerStarted","Data":"b04a7453549201d5f6233a2a1c97640fe2d2931db3d7288ee6667d84d666f48e"} Apr 16 18:40:25.246748 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:25.246712 2567 generic.go:358] "Generic (PLEG): container finished" podID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerID="f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96" exitCode=0 Apr 16 18:40:25.247116 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:25.246815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerDied","Data":"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96"} Apr 16 18:40:26.252442 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:26.252401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" 
event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerStarted","Data":"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0"} Apr 16 18:40:26.252442 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:26.252445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerStarted","Data":"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5"} Apr 16 18:40:26.252890 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:26.252544 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:26.275508 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:26.275463 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" podStartSLOduration=3.275446505 podStartE2EDuration="3.275446505s" podCreationTimestamp="2026-04-16 18:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:26.273059019 +0000 UTC m=+1845.477410845" watchObservedRunningTime="2026-04-16 18:40:26.275446505 +0000 UTC m=+1845.479798332" Apr 16 18:40:28.260803 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:28.260763 2567 generic.go:358] "Generic (PLEG): container finished" podID="6195eefa-8be8-4477-bda2-6585d515ce75" containerID="20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219" exitCode=0 Apr 16 18:40:28.261294 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:28.260838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" 
event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerDied","Data":"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219"} Apr 16 18:40:29.265460 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:29.265427 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerStarted","Data":"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf"} Apr 16 18:40:29.285948 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:29.285896 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" podStartSLOduration=7.2858809 podStartE2EDuration="7.2858809s" podCreationTimestamp="2026-04-16 18:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:29.284239199 +0000 UTC m=+1848.488591030" watchObservedRunningTime="2026-04-16 18:40:29.2858809 +0000 UTC m=+1848.490232727" Apr 16 18:40:33.191278 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.191233 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:33.191770 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.191303 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:33.203813 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.203789 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:33.289818 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.289790 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:40:33.437091 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.436998 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:33.437293 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.437118 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:33.439760 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:33.439732 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:34.282230 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:34.282205 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:40:55.286944 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:40:55.286916 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:41:15.898456 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.898371 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:41:15.901929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.901909 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.904202 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.904182 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:41:15.904279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.904210 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-lqq4l\"" Apr 16 18:41:15.914877 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.914852 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:41:15.996214 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996179 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.996214 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996213 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2b5\" (UniqueName: \"kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.996447 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996304 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.996447 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.996447 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996396 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:15.996447 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:15.996417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097492 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097457 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache\") pod 
\"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097705 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097705 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097705 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097705 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097682 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2b5\" (UniqueName: \"kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.097934 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.098024 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.097959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.098092 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.098076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.099965 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.099940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.100210 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.100195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.106832 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.106809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2b5\" (UniqueName: \"kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5\") pod \"router-with-refs-pd-test-kserve-5d9568576b-xpf5d\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.211661 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.211546 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:16.337460 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.337428 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:41:16.340233 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:41:16.340206 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324efcca_ef3c_4fea_82ac_6083acfd6160.slice/crio-a4743be4450aa1447749a094c86d5a44e8890a6c23d38dec82a79fb5acf97343 WatchSource:0}: Error finding container a4743be4450aa1447749a094c86d5a44e8890a6c23d38dec82a79fb5acf97343: Status 404 returned error can't find the container with id a4743be4450aa1447749a094c86d5a44e8890a6c23d38dec82a79fb5acf97343 Apr 16 18:41:16.414547 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:16.414514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerStarted","Data":"a4743be4450aa1447749a094c86d5a44e8890a6c23d38dec82a79fb5acf97343"} Apr 16 18:41:17.419435 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.419339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerStarted","Data":"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f"} Apr 16 18:41:17.419855 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.419648 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:17.791189 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.791099 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:41:17.791608 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.791532 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="main" containerID="cri-o://b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5" gracePeriod=30 Apr 16 18:41:17.791709 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.791611 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="tokenizer" containerID="cri-o://e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0" gracePeriod=30 Apr 16 18:41:17.798023 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.797995 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:41:17.798305 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:17.798278 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="main" containerID="cri-o://c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf" gracePeriod=30 Apr 16 18:41:18.062914 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.062887 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:41:18.223506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223476 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223732 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223538 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25btm\" (UniqueName: \"kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223732 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223608 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223732 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223633 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223732 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223669 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223732 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:41:18.223722 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs\") pod \"6195eefa-8be8-4477-bda2-6585d515ce75\" (UID: \"6195eefa-8be8-4477-bda2-6585d515ce75\") " Apr 16 18:41:18.223977 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223771 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home" (OuterVolumeSpecName: "home") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:18.223977 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.223856 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache" (OuterVolumeSpecName: "model-cache") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:18.224072 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.224040 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-home\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.224072 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.224059 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-model-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.226099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.226045 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm" (OuterVolumeSpecName: "dshm") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:18.226099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.226046 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm" (OuterVolumeSpecName: "kube-api-access-25btm") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "kube-api-access-25btm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:41:18.226296 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.226275 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:41:18.290123 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.290069 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6195eefa-8be8-4477-bda2-6585d515ce75" (UID: "6195eefa-8be8-4477-bda2-6585d515ce75"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:18.324773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.324685 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-dshm\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.324773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.324718 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6195eefa-8be8-4477-bda2-6585d515ce75-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.324773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.324732 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-25btm\" (UniqueName: \"kubernetes.io/projected/6195eefa-8be8-4477-bda2-6585d515ce75-kube-api-access-25btm\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.324773 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.324745 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6195eefa-8be8-4477-bda2-6585d515ce75-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:18.427158 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.427122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerStarted","Data":"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1"} Apr 16 18:41:18.429292 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.429267 2567 generic.go:358] "Generic (PLEG): container finished" podID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerID="b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5" exitCode=0 Apr 16 18:41:18.429459 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.429327 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerDied","Data":"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5"} Apr 16 18:41:18.430801 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.430771 2567 generic.go:358] "Generic (PLEG): container finished" podID="6195eefa-8be8-4477-bda2-6585d515ce75" containerID="c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf" exitCode=0 Apr 16 18:41:18.430912 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.430853 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" Apr 16 18:41:18.430912 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.430892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerDied","Data":"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf"} Apr 16 18:41:18.431026 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.430929 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk" event={"ID":"6195eefa-8be8-4477-bda2-6585d515ce75","Type":"ContainerDied","Data":"4b9f65185b55ebe0130544a050d0645e903639a0f486986c113145e09e044a52"} Apr 16 18:41:18.431026 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.430950 2567 scope.go:117] "RemoveContainer" containerID="c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf" Apr 16 18:41:18.441451 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.441427 2567 scope.go:117] "RemoveContainer" containerID="20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219" Apr 16 18:41:18.456519 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.456498 2567 scope.go:117] "RemoveContainer" containerID="c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf" Apr 16 18:41:18.456879 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:41:18.456857 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf\": container with ID starting with c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf not found: ID does not exist" containerID="c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf" Apr 16 18:41:18.456990 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.456886 2567 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf"} err="failed to get container status \"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf\": rpc error: code = NotFound desc = could not find container \"c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf\": container with ID starting with c35474fc1b933804c7f1be97613484d8369e0c3045d50cc19c996dbf1ebc65bf not found: ID does not exist" Apr 16 18:41:18.456990 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.456913 2567 scope.go:117] "RemoveContainer" containerID="20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219" Apr 16 18:41:18.457185 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:41:18.457167 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219\": container with ID starting with 20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219 not found: ID does not exist" containerID="20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219" Apr 16 18:41:18.457240 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.457192 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219"} err="failed to get container status \"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219\": rpc error: code = NotFound desc = could not find container \"20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219\": container with ID starting with 20a963c65209918b4f530ae19ef165f61e233aa4539e4bab1915b5fd5166c219 not found: ID does not exist" Apr 16 18:41:18.464859 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.464159 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:41:18.469775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:18.469749 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7b6bc495cf-465xk"] Apr 16 18:41:19.144901 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.144879 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:41:19.233744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233652 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp\") pod \"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.233744 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233697 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jhz\" (UniqueName: \"kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz\") pod \"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.233952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233775 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache\") pod \"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.233952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233844 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds\") pod 
\"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.233952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233872 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location\") pod \"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.233952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.233928 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs\") pod \"a921433d-5350-4a29-9aeb-6401bcaf1513\" (UID: \"a921433d-5350-4a29-9aeb-6401bcaf1513\") " Apr 16 18:41:19.234158 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234064 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:19.234158 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234087 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:19.234158 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234103 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:19.234313 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234285 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-tmp\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.234313 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234306 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.234413 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234323 2567 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-tokenizer-uds\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.234687 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.234663 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:19.235929 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.235897 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz" (OuterVolumeSpecName: "kube-api-access-q6jhz") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "kube-api-access-q6jhz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:41:19.236075 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.236059 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a921433d-5350-4a29-9aeb-6401bcaf1513" (UID: "a921433d-5350-4a29-9aeb-6401bcaf1513"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:41:19.311800 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.311763 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" path="/var/lib/kubelet/pods/6195eefa-8be8-4477-bda2-6585d515ce75/volumes" Apr 16 18:41:19.335401 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.335370 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a921433d-5350-4a29-9aeb-6401bcaf1513-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.335401 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.335400 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a921433d-5350-4a29-9aeb-6401bcaf1513-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.335595 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.335411 2567 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-q6jhz\" (UniqueName: \"kubernetes.io/projected/a921433d-5350-4a29-9aeb-6401bcaf1513-kube-api-access-q6jhz\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:41:19.436462 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.436421 2567 generic.go:358] "Generic (PLEG): container finished" podID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerID="e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0" exitCode=0 Apr 16 18:41:19.436927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.436499 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerDied","Data":"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0"} Apr 16 18:41:19.436927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.436532 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" Apr 16 18:41:19.436927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.436546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg" event={"ID":"a921433d-5350-4a29-9aeb-6401bcaf1513","Type":"ContainerDied","Data":"b04a7453549201d5f6233a2a1c97640fe2d2931db3d7288ee6667d84d666f48e"} Apr 16 18:41:19.436927 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.436602 2567 scope.go:117] "RemoveContainer" containerID="e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0" Apr 16 18:41:19.446801 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.446778 2567 scope.go:117] "RemoveContainer" containerID="b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5" Apr 16 18:41:19.455543 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.455520 2567 scope.go:117] "RemoveContainer" containerID="f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96" Apr 16 18:41:19.456349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.456325 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:41:19.459825 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.459801 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-b64b5bcxmg"] Apr 16 18:41:19.464456 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.464433 2567 scope.go:117] "RemoveContainer" containerID="e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0" Apr 16 18:41:19.464796 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:41:19.464775 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0\": container with ID starting with e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0 not found: ID does not exist" containerID="e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0" Apr 16 18:41:19.464879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.464808 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0"} err="failed to get container status \"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0\": rpc error: code = NotFound desc = could not find container \"e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0\": container with ID starting with e0d804e01b1f3b7facacab4a3831acf29268dd19d1f03f6d099d282f039fd7e0 not found: ID does not exist" Apr 16 18:41:19.464879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.464836 2567 scope.go:117] "RemoveContainer" containerID="b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5" Apr 16 18:41:19.465123 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:41:19.465106 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5\": container with ID starting with b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5 not found: ID does not exist" containerID="b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5" Apr 16 18:41:19.465163 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.465130 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5"} err="failed to get container status \"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5\": rpc error: code = NotFound desc = could not find container 
\"b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5\": container with ID starting with b1151988aaa366bc2c78c6cfc99a0aa1d0f9854ad53f26431811848197fa93a5 not found: ID does not exist" Apr 16 18:41:19.465163 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.465147 2567 scope.go:117] "RemoveContainer" containerID="f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96" Apr 16 18:41:19.465371 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:41:19.465355 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96\": container with ID starting with f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96 not found: ID does not exist" containerID="f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96" Apr 16 18:41:19.465413 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:19.465375 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96"} err="failed to get container status \"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96\": rpc error: code = NotFound desc = could not find container \"f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96\": container with ID starting with f53cb01a243299319aac7025bf7d2126208bcecbcc3f03cf578f2671574a6b96 not found: ID does not exist" Apr 16 18:41:21.312743 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:21.312708 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" path="/var/lib/kubelet/pods/a921433d-5350-4a29-9aeb-6401bcaf1513/volumes" Apr 16 18:41:21.448265 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:21.448236 2567 generic.go:358] "Generic (PLEG): container finished" podID="324efcca-ef3c-4fea-82ac-6083acfd6160" 
containerID="2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1" exitCode=0 Apr 16 18:41:21.448373 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:21.448293 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerDied","Data":"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1"} Apr 16 18:41:22.453579 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:22.453526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerStarted","Data":"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24"} Apr 16 18:41:22.478176 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:22.478114 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podStartSLOduration=6.663414481 podStartE2EDuration="7.478096249s" podCreationTimestamp="2026-04-16 18:41:15 +0000 UTC" firstStartedPulling="2026-04-16 18:41:16.342196362 +0000 UTC m=+1895.546548167" lastFinishedPulling="2026-04-16 18:41:17.156878131 +0000 UTC m=+1896.361229935" observedRunningTime="2026-04-16 18:41:22.47604784 +0000 UTC m=+1901.680399665" watchObservedRunningTime="2026-04-16 18:41:22.478096249 +0000 UTC m=+1901.682448076" Apr 16 18:41:26.212655 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:26.212613 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:26.213155 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:26.212667 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:26.213652 ip-10-0-141-189 
kubenswrapper[2567]: I0416 18:41:26.213623 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:41:36.212712 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:36.212660 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:41:36.226100 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:36.226064 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:41:46.211894 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:46.211848 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:41:56.212767 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:41:56.212707 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:06.212784 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:06.212730 2567 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:16.212444 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:16.212394 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:26.212190 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:26.212139 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:36.212577 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:36.212440 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:46.211991 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:46.211943 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 18:42:56.221806 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:56.221777 2567 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:42:56.233931 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:42:56.233908 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:43:17.832100 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:17.832051 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:43:17.832542 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:17.832378 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" containerID="cri-o://ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24" gracePeriod=30 Apr 16 18:43:33.475653 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:33.475625 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:33.560551 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:33.560520 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:33.570873 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:33.570852 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:33.584320 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:33.584303 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:34.600262 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:34.600233 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:34.641141 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:34.641113 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:34.648749 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:34.648731 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:34.660692 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:34.660659 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:35.675588 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:35.675547 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:35.714082 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:35.714056 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:35.722099 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:35.722086 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:35.733997 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:35.733976 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:36.738689 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:36.738662 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:36.779047 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:36.779021 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:36.786234 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:36.786209 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:36.799402 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:36.799383 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:37.809847 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:37.809822 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:37.879865 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:37.879839 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:37.892590 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:37.892572 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:37.906111 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:37.906093 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:38.903572 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:38.903534 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:38.944808 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:38.944787 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:38.953097 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:38.953073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:38.965879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:38.965858 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:39.958400 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:39.958369 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:39.997935 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:39.997912 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:40.006117 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:40.006099 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:40.025788 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:40.025767 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:41.006663 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:41.006638 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:41.054354 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:41.054327 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:41.065501 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:41.065480 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:41.080151 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:41.080131 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:42.078351 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:42.078321 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:42.124631 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:42.124601 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:42.135526 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:42.135507 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:42.153588 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:42.153567 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:43.151465 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:43.151431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:43.193118 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:43.193097 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:43.200669 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:43.200647 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:43.212496 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:43.212476 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:44.197881 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:44.197853 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:44.239257 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:44.239236 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:44.246765 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:44.246748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:44.258349 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:44.258333 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:45.255468 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:45.255437 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:45.296896 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:45.296873 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:45.304855 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:45.304836 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:45.316446 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:45.316427 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:46.456244 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:46.456211 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:46.500654 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:46.500631 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:46.509761 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:46.509744 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:46.523954 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:46.523918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:47.560815 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.560789 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-9kd8d_30a2bae1-62f9-4614-ab04-2a74ee4cbd67/istio-proxy/0.log" Apr 16 18:43:47.601862 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.601837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:47.611627 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.611606 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/llm-d-routing-sidecar/0.log" Apr 16 18:43:47.623629 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.623610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/storage-initializer/0.log" Apr 16 18:43:47.833443 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.833394 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="llm-d-routing-sidecar" containerID="cri-o://fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f" gracePeriod=2 Apr 16 18:43:47.950432 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.950405 2567 generic.go:358] "Generic (PLEG): container finished" podID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerID="fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f" exitCode=0 Apr 16 18:43:47.950545 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:47.950480 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" 
event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerDied","Data":"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f"} Apr 16 18:43:48.085673 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.085617 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:48.086245 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.086231 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:43:48.259611 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259576 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.259797 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259633 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2b5\" (UniqueName: \"kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.259797 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259665 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.259797 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259685 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.259797 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259711 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.260017 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.259793 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache\") pod \"324efcca-ef3c-4fea-82ac-6083acfd6160\" (UID: \"324efcca-ef3c-4fea-82ac-6083acfd6160\") " Apr 16 18:43:48.260090 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.260011 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache" (OuterVolumeSpecName: "model-cache") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.260143 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.260084 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home" (OuterVolumeSpecName: "home") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.260143 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.260108 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-model-cache\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.261900 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.261875 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:48.262189 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.262166 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm" (OuterVolumeSpecName: "dshm") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.262279 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.262232 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5" (OuterVolumeSpecName: "kube-api-access-xl2b5") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "kube-api-access-xl2b5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:48.318041 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.318010 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "324efcca-ef3c-4fea-82ac-6083acfd6160" (UID: "324efcca-ef3c-4fea-82ac-6083acfd6160"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:48.361222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.361157 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-dshm\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.361222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.361180 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl2b5\" (UniqueName: \"kubernetes.io/projected/324efcca-ef3c-4fea-82ac-6083acfd6160-kube-api-access-xl2b5\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.361222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.361196 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/324efcca-ef3c-4fea-82ac-6083acfd6160-tls-certs\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.361222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.361209 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-home\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.361222 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.361221 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/324efcca-ef3c-4fea-82ac-6083acfd6160-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 16 18:43:48.955485 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.955459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5d9568576b-xpf5d_324efcca-ef3c-4fea-82ac-6083acfd6160/main/0.log" Apr 16 18:43:48.956102 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.956081 2567 generic.go:358] "Generic (PLEG): container finished" podID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerID="ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24" exitCode=137 Apr 16 18:43:48.956183 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.956139 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerDied","Data":"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24"} Apr 16 18:43:48.956183 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.956161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" event={"ID":"324efcca-ef3c-4fea-82ac-6083acfd6160","Type":"ContainerDied","Data":"a4743be4450aa1447749a094c86d5a44e8890a6c23d38dec82a79fb5acf97343"} Apr 16 18:43:48.956183 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.956160 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d" Apr 16 18:43:48.956332 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.956236 2567 scope.go:117] "RemoveContainer" containerID="ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24" Apr 16 18:43:48.978506 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.978489 2567 scope.go:117] "RemoveContainer" containerID="2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1" Apr 16 18:43:48.980232 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.980213 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:43:48.983367 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.983344 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5d9568576b-xpf5d"] Apr 16 18:43:48.988602 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.988578 2567 scope.go:117] "RemoveContainer" containerID="fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f" Apr 16 18:43:48.995464 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.995443 2567 scope.go:117] "RemoveContainer" containerID="ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24" Apr 16 18:43:48.995732 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:43:48.995715 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24\": container with ID starting with ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24 not found: ID does not exist" containerID="ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24" Apr 16 18:43:48.995803 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.995739 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24"} err="failed to get container status \"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24\": rpc error: code = NotFound desc = could not find container \"ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24\": container with ID starting with ad37c50e909a12db19d5ae0cc91cceea05bdd93031bd757a833e5d871367cc24 not found: ID does not exist" Apr 16 18:43:48.995803 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.995757 2567 scope.go:117] "RemoveContainer" containerID="2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1" Apr 16 18:43:48.995943 ip-10-0-141-189 kubenswrapper[2567]: E0416 18:43:48.995929 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1\": container with ID starting with 2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1 not found: ID does not exist" containerID="2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1" Apr 16 18:43:48.995981 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.995947 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1"} err="failed to get container status \"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1\": rpc error: code = NotFound desc = could not find container \"2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1\": container with ID starting with 2f9ff510cf6f1e639e54e3b1b1ab9f0f2ae82aced8a7e3c16dc973cc4f981cf1 not found: ID does not exist" Apr 16 18:43:48.995981 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.995958 2567 scope.go:117] "RemoveContainer" containerID="fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f" Apr 16 18:43:48.996192 ip-10-0-141-189 
kubenswrapper[2567]: E0416 18:43:48.996170 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f\": container with ID starting with fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f not found: ID does not exist" containerID="fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f" Apr 16 18:43:48.996239 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:48.996199 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f"} err="failed to get container status \"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f\": rpc error: code = NotFound desc = could not find container \"fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f\": container with ID starting with fad47f50b7078d5275b9ef8143d0283a0c1e519a940f0ce823948d64f5bd8e6f not found: ID does not exist" Apr 16 18:43:49.310599 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:49.310494 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" path="/var/lib/kubelet/pods/324efcca-ef3c-4fea-82ac-6083acfd6160/volumes" Apr 16 18:43:50.537243 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:50.537203 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bbjnh_67dd972d-b8b2-4a9e-b970-30a3c3e75fac/manager/0.log" Apr 16 18:43:50.546255 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:50.546234 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-pvznl_1ac5f7a6-ea18-44b2-bcc7-9d234001848b/limitador/0.log" Apr 16 18:43:53.035081 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035046 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-qmqrm/must-gather-ht4qx"] Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035389 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="main" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035400 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="main" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035413 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035420 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035432 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035437 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035444 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035450 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="storage-initializer" Apr 16 18:43:53.035454 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035458 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035464 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035478 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="tokenizer" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035483 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="tokenizer" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035489 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="llm-d-routing-sidecar" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035494 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="llm-d-routing-sidecar" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035499 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035505 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035552 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035581 2567 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6195eefa-8be8-4477-bda2-6585d515ce75" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035590 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="main" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035600 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a921433d-5350-4a29-9aeb-6401bcaf1513" containerName="tokenizer" Apr 16 18:43:53.035775 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.035611 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="324efcca-ef3c-4fea-82ac-6083acfd6160" containerName="llm-d-routing-sidecar" Apr 16 18:43:53.040863 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.040843 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.044205 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.044181 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"kube-root-ca.crt\"" Apr 16 18:43:53.044355 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.044224 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qmqrm\"/\"default-dockercfg-5hdfg\"" Apr 16 18:43:53.044355 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.044181 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"openshift-service-ca.crt\"" Apr 16 18:43:53.045639 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.045614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/must-gather-ht4qx"] Apr 16 18:43:53.200954 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.200917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-must-gather-output\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.201137 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.200988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2qd\" (UniqueName: \"kubernetes.io/projected/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-kube-api-access-sh2qd\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.301789 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.301715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2qd\" (UniqueName: \"kubernetes.io/projected/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-kube-api-access-sh2qd\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.301789 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.301790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-must-gather-output\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.302071 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.302055 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-must-gather-output\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.309812 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.309783 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2qd\" (UniqueName: \"kubernetes.io/projected/bb0fa3ca-d87e-4c48-8445-18dadff2f5c5-kube-api-access-sh2qd\") pod \"must-gather-ht4qx\" (UID: \"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5\") " pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.351018 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.350990 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" Apr 16 18:43:53.467546 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.467527 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/must-gather-ht4qx"] Apr 16 18:43:53.469691 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:43:53.469659 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb0fa3ca_d87e_4c48_8445_18dadff2f5c5.slice/crio-1c82867c3d62a3b736c6359c8a48276426455a3d956dc9d42fc9c5b8fd394cb0 WatchSource:0}: Error finding container 1c82867c3d62a3b736c6359c8a48276426455a3d956dc9d42fc9c5b8fd394cb0: Status 404 returned error can't find the container with id 1c82867c3d62a3b736c6359c8a48276426455a3d956dc9d42fc9c5b8fd394cb0 Apr 16 18:43:53.471385 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.471368 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:43:53.975731 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:53.975693 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" event={"ID":"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5","Type":"ContainerStarted","Data":"1c82867c3d62a3b736c6359c8a48276426455a3d956dc9d42fc9c5b8fd394cb0"} Apr 16 18:43:54.982016 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:54.981975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" 
event={"ID":"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5","Type":"ContainerStarted","Data":"a9c0c5d928466ea9ed13a970568751effc7180617ffed6456c81bc12d99d656a"} Apr 16 18:43:54.982016 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:54.982020 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" event={"ID":"bb0fa3ca-d87e-4c48-8445-18dadff2f5c5","Type":"ContainerStarted","Data":"30dd2cb2695c51c745ae8c9b86e345d544f5143e305947a243c4300646f5cf04"} Apr 16 18:43:54.999477 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:54.999316 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmqrm/must-gather-ht4qx" podStartSLOduration=1.109840818 podStartE2EDuration="1.999298082s" podCreationTimestamp="2026-04-16 18:43:53 +0000 UTC" firstStartedPulling="2026-04-16 18:43:53.471521578 +0000 UTC m=+2052.675873385" lastFinishedPulling="2026-04-16 18:43:54.360978844 +0000 UTC m=+2053.565330649" observedRunningTime="2026-04-16 18:43:54.996395158 +0000 UTC m=+2054.200746985" watchObservedRunningTime="2026-04-16 18:43:54.999298082 +0000 UTC m=+2054.203649910" Apr 16 18:43:55.982470 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:55.982443 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z58pc_c2bb0d16-0207-47e2-ad4a-862fbaea345c/global-pull-secret-syncer/0.log" Apr 16 18:43:56.050710 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:56.050683 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ptp6j_bb1eecb7-d44f-4eaa-8a18-4e159e76feaa/konnectivity-agent/0.log" Apr 16 18:43:56.106667 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:43:56.106634 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-189.ec2.internal_2c187cd16f95e6d697a69aca59443285/haproxy/0.log" Apr 16 18:44:00.495822 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:00.495778 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bbjnh_67dd972d-b8b2-4a9e-b970-30a3c3e75fac/manager/0.log"
Apr 16 18:44:00.534613 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:00.534524 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-pvznl_1ac5f7a6-ea18-44b2-bcc7-9d234001848b/limitador/0.log"
Apr 16 18:44:01.532830 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.532803 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/alertmanager/0.log"
Apr 16 18:44:01.561858 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.561810 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/config-reloader/0.log"
Apr 16 18:44:01.588842 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.588816 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/kube-rbac-proxy-web/0.log"
Apr 16 18:44:01.620852 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.620820 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/kube-rbac-proxy/0.log"
Apr 16 18:44:01.645236 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.645210 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/kube-rbac-proxy-metric/0.log"
Apr 16 18:44:01.670402 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.670378 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/prom-label-proxy/0.log"
Apr 16 18:44:01.695850 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.695814 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_188bb96a-e2ed-4695-ab48-9de2bfac7427/init-config-reloader/0.log"
Apr 16 18:44:01.843225 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.843200 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fcl87_fd9f939d-a19f-43a9-aba5-792530142e5a/kube-state-metrics/0.log"
Apr 16 18:44:01.869386 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.869348 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fcl87_fd9f939d-a19f-43a9-aba5-792530142e5a/kube-rbac-proxy-main/0.log"
Apr 16 18:44:01.894416 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.894389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fcl87_fd9f939d-a19f-43a9-aba5-792530142e5a/kube-rbac-proxy-self/0.log"
Apr 16 18:44:01.940690 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:01.940663 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-66c678675d-p6mc8_08380e44-f4bb-4d91-b88f-becbbd96a33d/metrics-server/0.log"
Apr 16 18:44:02.202111 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.202029 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-znszh_f9a13485-2c15-41ed-bd77-dcd33a714804/node-exporter/0.log"
Apr 16 18:44:02.243359 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.243328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-znszh_f9a13485-2c15-41ed-bd77-dcd33a714804/kube-rbac-proxy/0.log"
Apr 16 18:44:02.269880 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.269848 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-znszh_f9a13485-2c15-41ed-bd77-dcd33a714804/init-textfile/0.log"
Apr 16 18:44:02.412514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.412479 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/prometheus/0.log"
Apr 16 18:44:02.434897 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.434867 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/config-reloader/0.log"
Apr 16 18:44:02.459693 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.459610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/thanos-sidecar/0.log"
Apr 16 18:44:02.483809 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.483776 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/kube-rbac-proxy-web/0.log"
Apr 16 18:44:02.508907 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.508874 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/kube-rbac-proxy/0.log"
Apr 16 18:44:02.539257 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.539226 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/kube-rbac-proxy-thanos/0.log"
Apr 16 18:44:02.563952 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.563922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_881d1e93-5753-43ad-9c2a-d449bf97eb14/init-config-reloader/0.log"
Apr 16 18:44:02.659917 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.659883 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-p9d8k_5541e9a7-264e-46c1-96dd-157bccc85ffe/prometheus-operator-admission-webhook/0.log"
Apr 16 18:44:02.695002 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.694965 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bd779bd54-vndd7_3be674fa-942d-48cb-bea1-2b59304bf983/telemeter-client/0.log"
Apr 16 18:44:02.725824 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.725750 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bd779bd54-vndd7_3be674fa-942d-48cb-bea1-2b59304bf983/reload/0.log"
Apr 16 18:44:02.750833 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:02.750798 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bd779bd54-vndd7_3be674fa-942d-48cb-bea1-2b59304bf983/kube-rbac-proxy/0.log"
Apr 16 18:44:04.471842 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.471802 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"]
Apr 16 18:44:04.479641 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.479615 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.483357 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.483323 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"]
Apr 16 18:44:04.511057 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.511028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-proc\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.511211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.511096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttfcq\" (UniqueName: \"kubernetes.io/projected/1fd918ac-8a15-499f-a762-b4f7000a1edc-kube-api-access-ttfcq\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.511211 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.511136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-podres\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.511304 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.511229 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-sys\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.511304 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.511270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-lib-modules\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612633 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-proc\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612633 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttfcq\" (UniqueName: \"kubernetes.io/projected/1fd918ac-8a15-499f-a762-b4f7000a1edc-kube-api-access-ttfcq\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-podres\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-sys\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612732 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-lib-modules\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-proc\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-lib-modules\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612834 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-sys\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.612879 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.612867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fd918ac-8a15-499f-a762-b4f7000a1edc-podres\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.620941 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.620917 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttfcq\" (UniqueName: \"kubernetes.io/projected/1fd918ac-8a15-499f-a762-b4f7000a1edc-kube-api-access-ttfcq\") pod \"perf-node-gather-daemonset-fhmsp\" (UID: \"1fd918ac-8a15-499f-a762-b4f7000a1edc\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.799150 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.799061 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:04.920326 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.920296 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-pkr4q_2e637d56-5f0b-48b7-8dac-7c0dca151d31/download-server/0.log"
Apr 16 18:44:04.931376 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:04.931356 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"]
Apr 16 18:44:04.933738 ip-10-0-141-189 kubenswrapper[2567]: W0416 18:44:04.933660 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1fd918ac_8a15_499f_a762_b4f7000a1edc.slice/crio-d146eb4428d69ccde8b30e7f52d855af413c0c4c46f38ae64de563c04613adad WatchSource:0}: Error finding container d146eb4428d69ccde8b30e7f52d855af413c0c4c46f38ae64de563c04613adad: Status 404 returned error can't find the container with id d146eb4428d69ccde8b30e7f52d855af413c0c4c46f38ae64de563c04613adad
Apr 16 18:44:05.027358 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:05.027308 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp" event={"ID":"1fd918ac-8a15-499f-a762-b4f7000a1edc","Type":"ContainerStarted","Data":"43743168c57676ec369c2a95bdf7fc3935e9b458dccc5952b2240309b3c73e50"}
Apr 16 18:44:05.027466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:05.027364 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp" event={"ID":"1fd918ac-8a15-499f-a762-b4f7000a1edc","Type":"ContainerStarted","Data":"d146eb4428d69ccde8b30e7f52d855af413c0c4c46f38ae64de563c04613adad"}
Apr 16 18:44:05.027466 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:05.027434 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:05.043612 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:05.043552 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp" podStartSLOduration=1.04353698 podStartE2EDuration="1.04353698s" podCreationTimestamp="2026-04-16 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:05.041768232 +0000 UTC m=+2064.246120078" watchObservedRunningTime="2026-04-16 18:44:05.04353698 +0000 UTC m=+2064.247888810"
Apr 16 18:44:06.148614 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:06.148549 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp6zw_05e20dd4-40f5-487a-b14b-4e0d5b971aeb/dns/0.log"
Apr 16 18:44:06.173832 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:06.173805 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp6zw_05e20dd4-40f5-487a-b14b-4e0d5b971aeb/kube-rbac-proxy/0.log"
Apr 16 18:44:06.300408 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:06.300380 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lc2jd_edb0e160-57f6-4631-88ff-4d22d6b51543/dns-node-resolver/0.log"
Apr 16 18:44:06.802779 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:06.802754 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-87pfd_767a8b0d-4ec4-41ae-a78e-8d1bc4c8cc92/node-ca/0.log"
Apr 16 18:44:08.235865 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:08.235830 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vvblr_040d516b-7ed2-4aa4-b0b4-bc6131bac5cf/serve-healthcheck-canary/0.log"
Apr 16 18:44:08.759178 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:08.759151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn9c9_59c9a688-bbaa-4127-917e-ea403858f0a6/kube-rbac-proxy/0.log"
Apr 16 18:44:08.782514 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:08.782486 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn9c9_59c9a688-bbaa-4127-917e-ea403858f0a6/exporter/0.log"
Apr 16 18:44:08.806223 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:08.806206 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn9c9_59c9a688-bbaa-4127-917e-ea403858f0a6/extractor/0.log"
Apr 16 18:44:11.040437 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:11.040408 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-fhmsp"
Apr 16 18:44:11.346843 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:11.346821 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-65bdb464b4-t68ct_78238f47-b733-40e0-843a-76b695784854/manager/0.log"
Apr 16 18:44:12.005979 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:12.005949 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6c5b6b4855-bsvqv_7f3eb740-370e-4df5-a971-d328bf1a09c4/manager/0.log"
Apr 16 18:44:12.241906 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:12.241881 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-wq7x2_006ba9e6-35c5-4e2e-b6b9-fa60ace64d7a/s3-init/0.log"
Apr 16 18:44:18.614889 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.614858 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cp9f_b1985221-eb0c-4868-b1f9-55585c1796dc/kube-multus/0.log"
Apr 16 18:44:18.822598 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.822569 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:44:18.846656 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.846634 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/egress-router-binary-copy/0.log"
Apr 16 18:44:18.874760 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.874701 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/cni-plugins/0.log"
Apr 16 18:44:18.897080 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.897060 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/bond-cni-plugin/0.log"
Apr 16 18:44:18.919778 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.919758 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/routeoverride-cni/0.log"
Apr 16 18:44:18.942721 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.942705 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/whereabouts-cni-bincopy/0.log"
Apr 16 18:44:18.965261 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:18.965239 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5mwl_c4bda706-5aa8-4750-aa51-1fd47724ec81/whereabouts-cni/0.log"
Apr 16 18:44:19.232142 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:19.232069 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bbpzd_d4d545a6-5b19-4165-9bd6-f5c19acf145a/network-metrics-daemon/0.log"
Apr 16 18:44:19.264227 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:19.264208 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bbpzd_d4d545a6-5b19-4165-9bd6-f5c19acf145a/kube-rbac-proxy/0.log"
Apr 16 18:44:20.203393 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.203361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/ovn-controller/0.log"
Apr 16 18:44:20.233872 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.233846 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/ovn-acl-logging/0.log"
Apr 16 18:44:20.254425 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.254404 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/kube-rbac-proxy-node/0.log"
Apr 16 18:44:20.277406 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.277386 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:44:20.299480 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.299461 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/northd/0.log"
Apr 16 18:44:20.322445 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.322422 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/nbdb/0.log"
Apr 16 18:44:20.346048 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.346026 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/sbdb/0.log"
Apr 16 18:44:20.472129 ip-10-0-141-189 kubenswrapper[2567]: I0416 18:44:20.472107 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htm65_9bd668bd-f016-4d96-a5ed-1376b7d9e8ec/ovnkube-controller/0.log"