Apr 16 16:00:43.725757 ip-10-0-140-78 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:00:44.307180 ip-10-0-140-78 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:00:44.307180 ip-10-0-140-78 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:00:44.307180 ip-10-0-140-78 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:00:44.307180 ip-10-0-140-78 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:00:44.307180 ip-10-0-140-78 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
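The deprecation warnings above all point at the same remedy: move the flags into the file passed via --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration stanza — field names are from the kubelet.config.k8s.io/v1beta1 API; the endpoint value echoes the flag dump later in this log, while the systemReserved and evictionHard values are illustrative assumptions, not taken from this node:

```yaml
# Sketch of a KubeletConfiguration replacing the deprecated flags above.
# systemReserved/evictionHard values are illustrative, not this node's.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # per the --minimum-container-ttl-duration warning,
  memory.available: 100Mi  # use eviction thresholds instead
```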
Apr 16 16:00:44.308397 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.308257 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:00:44.311339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311323 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:00:44.311339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311340 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311343 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311347 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311349 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311352 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311355 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311357 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311360 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311364 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311368 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311371 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311374 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311383 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311387 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311389 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311392 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311394 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311397 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311400 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:00:44.311403 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311403 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311406 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311409 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311412 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311414 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311417 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311420 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311422 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311425 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311427 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311429 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311432 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311434 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311437 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311440 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311442 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311462 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311466 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311470 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:00:44.311864 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311485 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311488 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311491 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311493 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311496 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311500 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311504 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311507 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311510 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311513 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311515 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311518 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311521 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311523 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311526 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311528 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311531 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311534 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311537 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:00:44.312397 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311540 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311542 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311545 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311547 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311550 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311553 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311556 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311559 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311561 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311568 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311571 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311574 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311577 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311579 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311581 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311584 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311586 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311590 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311593 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311595 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:00:44.312871 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311598 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311600 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311603 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311605 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311608 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311610 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311613 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.311615 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312029 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312034 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312037 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312040 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312043 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312045 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312049 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312053 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312055 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312058 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312060 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312064 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:00:44.313343 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312066 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312069 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312072 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312074 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312077 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312079 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312082 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312084 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312087 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312089 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312092 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312094 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312096 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312099 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312101 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312104 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312106 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312108 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312111 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312113 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:00:44.313843 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312116 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312119 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312122 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312124 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312127 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312129 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312132 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312134 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312137 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312140 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312142 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312145 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312147 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312150 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312153 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312155 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312158 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312160 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312163 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312165 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:00:44.314338 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312167 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312170 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312174 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
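The warnings above come from gate names the kubelet's own feature-gate registry does not know (most are OpenShift cluster-level gates), while known gates such as KMSv1 and ServiceAccountTokenNodeBinding are applied with deprecation/GA notices. In the kubelet config file, such gates are plain booleans under featureGates; a hypothetical fragment mirroring two gates seen in this log:

```yaml
# Hypothetical KubeletConfiguration fragment; gate names are taken from the
# log above, but this is not this node's actual rendered config.
featureGates:
  KMSv1: true                           # logged as a deprecated gate
  ServiceAccountTokenNodeBinding: true  # logged as a GA gate
```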
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312177 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312180 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312183 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312186 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312189 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312191 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312194 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312197 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312199 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312202 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312206 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312209 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312211 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312214 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312216 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312219 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:00:44.314874 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312221 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312223 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312226 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312228 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312231 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312233 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312236 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312238 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312241 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312244 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312246 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312249 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312251 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312253 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.312256 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314221 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314233 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314240 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314245 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314250 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314253 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:00:44.315339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314258 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314262 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314265 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314268 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314272 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314277 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314280 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314283 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314286 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314289 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314292 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314294 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314297 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314301 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314304 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314307 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314310 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314313 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314317 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314320 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314331 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314335 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314338 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314341 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:00:44.315852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314344 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314347 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314350 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314354 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314357 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314360 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314363 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314366 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314369 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314376 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314379 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314382 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314385 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314388 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314393 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314396 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314398 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314402 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314404 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314407 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314410 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314412 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314415 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314418 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314421 2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:00:44.316526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314424 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314428 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314431 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314434 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314443 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314457 2577 flags.go:64] FLAG: --help="false"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314460 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-140-78.ec2.internal"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314463 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314466 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314469 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 
16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314472 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314476 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314479 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314482 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314485 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314487 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314490 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314493 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314496 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314499 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314502 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314505 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314507 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:00:44.317127 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314510 2577 flags.go:64] FLAG: --lock-file="" Apr 16 16:00:44.317127 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314513 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314516 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314519 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314524 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314527 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314530 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314533 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314535 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314538 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314541 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314544 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314548 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314551 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314561 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314564 2577 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314567 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314571 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314573 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314576 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314579 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314582 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314589 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314592 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314595 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:00:44.317720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314598 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314601 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314607 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314609 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: 
I0416 16:00:44.314612 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314615 2577 flags.go:64] FLAG: --port="10250" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314618 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314621 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0759265e06e2660a2" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314624 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314627 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314630 2577 flags.go:64] FLAG: --register-node="true" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314632 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314635 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314639 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314642 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314644 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314647 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314651 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314654 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314656 2577 
flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314659 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314662 2577 flags.go:64] FLAG: --runonce="false" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314666 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314669 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314672 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:00:44.318284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314675 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314678 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314681 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314684 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314687 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314689 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314692 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314695 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314697 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314700 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314703 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314706 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314711 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314714 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314716 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314723 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314725 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314728 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314730 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314733 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314736 2577 flags.go:64] FLAG: --v="2" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314740 2577 flags.go:64] FLAG: --version="false" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314744 2577 flags.go:64] FLAG: --vmodule="" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314748 2577 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.314751 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:00:44.318886 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314883 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314887 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314889 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314893 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314896 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314899 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314905 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314908 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314911 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314914 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314916 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:00:44.319512 ip-10-0-140-78 
kubenswrapper[2577]: W0416 16:00:44.314919 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314922 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314924 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314927 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314929 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314932 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314934 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314937 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314940 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:00:44.319512 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314942 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314946 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314949 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314952 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314954 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314957 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314959 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314962 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314965 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314967 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314970 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314972 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314975 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314977 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 
16:00:44.314980 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314983 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314985 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314988 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314992 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:00:44.320042 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314995 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.314997 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315000 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315003 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315005 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315008 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315010 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315013 2577 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315015 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315018 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315020 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315023 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315025 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315028 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315030 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315033 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315036 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315038 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315041 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315043 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:00:44.320546 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315046 2577 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315048 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315051 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315055 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315058 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315061 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315064 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315067 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315070 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315072 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315075 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315079 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315082 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:00:44.321030 
ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315085 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315088 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315090 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315093 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315095 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315098 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315100 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:00:44.321030 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315103 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315105 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315108 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315110 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315113 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315115 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 
16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.315118 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:00:44.321557 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.315865 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:00:44.323901 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.323814 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:00:44.324058 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.324046 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:00:44.324125 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324116 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:00:44.324125 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324124 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324128 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324132 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324135 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324138 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324141 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324144 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324146 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324149 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324152 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324154 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324157 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324160 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324162 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324164 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324167 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324170 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324172 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324175 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324177 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:00:44.324182 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324179 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324182 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324185 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324187 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324190 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324193 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324196 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324198 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324202 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324206 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324212 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324215 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324217 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324220 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324222 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324225 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324227 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324230 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324232 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:00:44.324702 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324235 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324237 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324240 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324242 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324244 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324247 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324249 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324252 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324254 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324256 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324259 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324261 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324264 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324266 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324268 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324272 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324274 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324277 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324279 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324281 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:00:44.325174 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324284 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324286 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324289 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324292 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324294 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324296 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324299 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324301 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324303 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324306 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324308 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324310 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324313 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324317 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324321 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324324 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324327 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324329 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324332 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:00:44.325667 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324334 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324337 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324340 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324342 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324345 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324347 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324349 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.324355 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324470 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324477 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324480 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324484 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324486 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324489 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324492 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324495 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:00:44.326161 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324498 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324500 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324503 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324505 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324508 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324511 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324513 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324516 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324519 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324521 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324523 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324526 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324528 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324530 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324534 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324536 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324539 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324541 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324544 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:00:44.326568 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324547 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324549 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324552 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324554 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324557 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324561 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324565 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324567 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324570 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324573 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324576 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324578 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324581 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324583 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324586 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324588 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324591 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324593 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324596 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:00:44.327032 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324598 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324600 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324603 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324605 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324608 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324610 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324613 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324617 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324620 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324623 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324626 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324629 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324631 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324634 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324636 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324639 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324641 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324644 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324646 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324649 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:00:44.327500 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324651 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324654 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324656 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324659 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324661 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324663 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324666 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324668 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324671 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324673 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324676 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324678 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324680 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324683 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324685 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324687 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324690 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324692 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324694 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:00:44.327965 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:44.324697 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:00:44.328413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.324702 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:00:44.328413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.325598 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:00:44.328719 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.328706 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:00:44.329856 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.329845 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 16:00:44.329961 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.329944 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:00:44.329999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.329989 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:00:44.370929 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.370914 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:00:44.373791 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.373774 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:00:44.392035 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.392016 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:00:44.397931 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.397917 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 16:00:44.400074 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.400057 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:00:44.404756 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.404737 2577 fs.go:135] Filesystem UUIDs: map[177b06d6-c2e8-4cf4-855a-364629e70970:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a25e5da2-1722-4896-a4d5-7a96f69e6428:/dev/nvme0n1p3]
Apr 16 16:00:44.404803 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.404757 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:00:44.408112 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.408095 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:00:44.412616 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.412496 2577 manager.go:217] Machine: {Timestamp:2026-04-16 16:00:44.41038954 +0000 UTC m=+0.531047470 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101499 MemoryCapacity:32812179456 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20a28d50d3091575cb2134606df3c8 SystemUUID:ec20a28d-50d3-0915-75cb-2134606df3c8 BootID:57fe4501-d0dc-4c7b-969b-8fc8ee147321 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406089728 Type:vfs Inodes:4005393 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562439168 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ff:12:03:fc:bf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ff:12:03:fc:bf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:ee:f7:7c:ba:7a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812179456 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:00:44.412616 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.412606 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:00:44.412737 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.412687 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:00:44.413084 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.413062 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:00:44.413214 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.413086 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-78.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:00:44.413262 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.413221 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:00:44.413262 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.413230 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:00:44.413262 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.413243 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:00:44.414163 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.414153 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:00:44.416042 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.416032 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:00:44.416149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.416140 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:00:44.419003 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.418994 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:00:44.419041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.419007 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:00:44.419041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.419018 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:00:44.419041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.419026 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:00:44.419041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.419035 2577 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 16:00:44.420207 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.420195 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:00:44.420259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.420215 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:00:44.423872 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.423856 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:00:44.425848 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.425835 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:00:44.427979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.427966 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.427995 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428004 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428009 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428015 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428020 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428025 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428031 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428037 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:00:44.428040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428043 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:00:44.428272 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428052 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:00:44.428272 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.428061 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:00:44.429066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.429056 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:00:44.429066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.429066 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:00:44.432981 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.432966 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:00:44.433041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.433001 2577 server.go:1295] "Started kubelet" Apr 16 16:00:44.433136 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.433101 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:00:44.433223 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.433128 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:00:44.433269 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.433257 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:00:44.433831 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.433793 2577 csi_plugin.go:988] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-78.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:00:44.433885 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.433852 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-78.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:00:44.433885 ip-10-0-140-78 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:00:44.434020 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.433985 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:00:44.434363 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.434348 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:00:44.436415 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.436402 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:00:44.440500 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.440479 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:00:44.441166 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.441146 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:00:44.442095 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.440479 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{ip-10-0-140-78.ec2.internal.18a6e1aeeb54748e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-78.ec2.internal,UID:ip-10-0-140-78.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-78.ec2.internal,},FirstTimestamp:2026-04-16 16:00:44.432979086 +0000 UTC m=+0.553637016,LastTimestamp:2026-04-16 16:00:44.432979086 +0000 UTC m=+0.553637016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-78.ec2.internal,}" Apr 16 16:00:44.442523 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.442476 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:00:44.442717 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.442704 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:00:44.442782 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.442722 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:00:44.443489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.442866 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:00:44.443489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.442879 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:00:44.443489 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.442900 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found" Apr 16 16:00:44.443489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443379 2577 factory.go:153] Registering CRI-O factory Apr 16 16:00:44.443489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443430 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 16:00:44.443768 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443512 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:00:44.443768 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443525 2577 factory.go:55] Registering systemd factory Apr 16 16:00:44.443768 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443532 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:00:44.443768 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443554 2577 factory.go:103] Registering Raw factory Apr 16 16:00:44.443768 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.443569 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 16:00:44.444007 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.443829 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:00:44.445087 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.445072 2577 manager.go:319] Starting recovery of all containers Apr 16 16:00:44.449048 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.449019 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-78.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:00:44.449175 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.449152 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:00:44.456078 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.456060 2577 manager.go:324] Recovery completed Apr 16 16:00:44.458981 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.458818 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wgjfb" Apr 16 16:00:44.460344 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.460333 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.463670 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.463655 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.463743 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.463682 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 16:00:44.463743 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.463697 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.464268 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.464251 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:00:44.464268 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.464268 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:00:44.464386 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.464286 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:00:44.465514 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.465427 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-78.ec2.internal.18a6e1aeed28c26b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-78.ec2.internal,UID:ip-10-0-140-78.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-78.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-78.ec2.internal,},FirstTimestamp:2026-04-16 16:00:44.463669867 +0000 UTC m=+0.584327796,LastTimestamp:2026-04-16 16:00:44.463669867 +0000 UTC m=+0.584327796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-78.ec2.internal,}" Apr 16 16:00:44.466553 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.466539 2577 policy_none.go:49] "None policy: Start" Apr 16 16:00:44.466623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.466560 2577 memory_manager.go:186] "Starting memorymanager" 
policy="None" Apr 16 16:00:44.466623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.466573 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:00:44.466913 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.466898 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wgjfb" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.510627 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.510700 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.510714 2577 server.go:85] "Starting device plugin registration server" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.510979 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.510993 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.511057 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.511142 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.511151 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.511742 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 16:00:44.516157 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.511783 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-78.ec2.internal\" not found" Apr 16 16:00:44.543373 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.543340 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:00:44.544671 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.544652 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:00:44.544773 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.544680 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:00:44.544773 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.544698 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:00:44.544773 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.544705 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:00:44.544773 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.544739 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:00:44.549254 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.549235 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:00:44.611164 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.611082 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.612018 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.612002 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.612093 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.612030 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:00:44.612093 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.612041 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.612093 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.612071 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.621267 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.621253 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.621312 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.621274 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-78.ec2.internal\": node \"ip-10-0-140-78.ec2.internal\" not found" Apr 16 16:00:44.644611 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.644593 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found" Apr 16 16:00:44.645597 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.645570 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal"] Apr 16 16:00:44.645692 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.645654 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.647217 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.647199 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.647304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.647227 2577 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-140-78.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:00:44.647304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.647241 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.648579 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.648567 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.648731 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.648716 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.648774 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.648747 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.649200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649186 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.649265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649199 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.649265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649212 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:00:44.649265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649218 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:00:44.649265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649227 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.649265 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.649233 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.650851 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.650837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.650902 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.650867 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:00:44.651557 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.651542 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:00:44.651617 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.651571 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:00:44.651617 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.651585 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:00:44.670003 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.669984 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-78.ec2.internal\" not found" node="ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.673245 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.673230 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-78.ec2.internal\" not found" node="ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.744110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.744091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.744221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.744124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.744221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.744141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e885656e91016de60540dee8de7f05c-config\") pod \"kube-apiserver-proxy-ip-10-0-140-78.ec2.internal\" (UID: \"2e885656e91016de60540dee8de7f05c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.745107 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.745093 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found" Apr 16 16:00:44.844746 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e885656e91016de60540dee8de7f05c-config\") pod \"kube-apiserver-proxy-ip-10-0-140-78.ec2.internal\" (UID: \"2e885656e91016de60540dee8de7f05c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.844866 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.844866 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.844866 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e885656e91016de60540dee8de7f05c-config\") pod \"kube-apiserver-proxy-ip-10-0-140-78.ec2.internal\" (UID: \"2e885656e91016de60540dee8de7f05c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.844866 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" Apr 16 16:00:44.845004 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.844829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16cda4de244fe8d0f7938f19308551ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal\" (UID: \"16cda4de244fe8d0f7938f19308551ad\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal"
Apr 16 16:00:44.845743 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.845722 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found"
Apr 16 16:00:44.946631 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:44.946560 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found"
Apr 16 16:00:44.973775 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.973732 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal"
Apr 16 16:00:44.976541 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:44.976525 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal"
Apr 16 16:00:45.046853 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.046810 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found"
Apr 16 16:00:45.147361 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.147321 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found"
Apr 16 16:00:45.247830 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.247749 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-78.ec2.internal\" not found"
Apr 16 16:00:45.278307 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.278275 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:00:45.330053 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.330020 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:00:45.330510 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.330172 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:00:45.330510 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.330210 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:00:45.342189 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.342168 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal"
Apr 16 16:00:45.365593 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.365568 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:00:45.369058 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.369044 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal"
Apr 16 16:00:45.377740 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.377724 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:00:45.419938 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.419912 2577 apiserver.go:52] "Watching apiserver"
Apr 16 16:00:45.427284 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.427262 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:00:45.428314 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.428284 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-v84j7","kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal","openshift-cluster-node-tuning-operator/tuned-kpt9z","openshift-image-registry/node-ca-lz8kp","openshift-multus/multus-additional-cni-plugins-qmfsc","openshift-multus/multus-z72rs","openshift-network-operator/iptables-alerter-wc9sm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal","openshift-multus/network-metrics-daemon-nwbqd","openshift-network-diagnostics/network-check-target-lc585","openshift-ovn-kubernetes/ovnkube-node-grb4s"]
Apr 16 16:00:45.431433 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.431411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v84j7"
Apr 16 16:00:45.432560 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.432534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.432658 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.432616 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.433735 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.433715 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.434247 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.434229 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:00:45.434343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.434260 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:00:45.434343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.434273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-66kbc\""
Apr 16 16:00:45.435214 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435198 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.435376 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435359 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.435474 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435410 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.435536 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5wrm7\""
Apr 16 16:00:45.435536 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435479 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.435536 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435502 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5hqq4\""
Apr 16 16:00:45.435683 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:00:45.435841 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.435827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.436019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:00:45.436019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436015 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:00:45.436281 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436245 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.436365 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:00:45.436751 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436734 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.436835 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.436776 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9tznm\""
Apr 16 16:00:45.437574 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.437559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.438855 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.438568 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q22dv\""
Apr 16 16:00:45.438855 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.438570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:00:45.438855 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.438639 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.438855 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.438748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd"
Apr 16 16:00:45.439022 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.438865 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e"
Apr 16 16:00:45.439763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.439749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.439840 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.439802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.439885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.439843 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585"
Apr 16 16:00:45.439926 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.439906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1"
Apr 16 16:00:45.439991 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.439947 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:00:45.440027 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.439999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7xt72\""
Apr 16 16:00:45.440592 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.440575 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:00:45.441075 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.441059 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.441248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.441219 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:00:45.441321 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.441286 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.441382 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.441321 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9lfkb\""
Apr 16 16:00:45.441584 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.441548 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.443995 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.443854 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:00:45.443995 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.443871 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9d7ct\""
Apr 16 16:00:45.443995 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.443870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:00:45.443995 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.443925 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:00:45.443995 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.443876 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:00:45.444875 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.444856 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:00:45.444979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.444960 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:00:45.445194 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.445178 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:00:45.447554 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-lib-modules\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.447629 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447560 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-tmp\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.447629 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585"
Apr 16 16:00:45.447629 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-kubelet\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.447629 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.447629 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-script-lib\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.447825 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gdt9\" (UniqueName: \"kubernetes.io/projected/803316ee-89ca-4413-8b50-f8e2771aae54-kube-api-access-7gdt9\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.447825 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd"
Apr 16 16:00:45.447825 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-slash\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.447825 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-bin\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.448000 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-systemd\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.448000 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-system-cni-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.448000 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2df5883a-3545-46d2-90ae-bd313fcad285-iptables-alerter-script\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.448000 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdn48\" (UniqueName: \"kubernetes.io/projected/b0890f1f-a03c-4336-8b33-5daaa65706b0-kube-api-access-kdn48\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.447938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b98433d-7f8e-4ded-9eb4-1696faebcffe-konnectivity-ca\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-hostroot\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-sys-fs\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-conf\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-run\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brd5c\" (UniqueName: \"kubernetes.io/projected/dc41db5b-cc00-4f67-be36-85fcec39776e-kube-api-access-brd5c\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-log-socket\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-netns\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-daemon-config\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-multus\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-etc-kubernetes\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-socket-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-modprobe-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysconfig\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.448975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsdb\" (UniqueName: \"kubernetes.io/projected/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-kube-api-access-6vsdb\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.450571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2df5883a-3545-46d2-90ae-bd313fcad285-host-slash\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cni-binary-copy\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-socket-dir-parent\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-bin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-config\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-env-overrides\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-conf-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-multus-certs\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-kubernetes\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-cnibin\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-systemd-units\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449474 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-serviceca\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-k8s-cni-cncf-io\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5nss\" (UniqueName: \"kubernetes.io/projected/86d0be28-1879-425a-a135-e8471798e397-kube-api-access-b5nss\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.451369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-os-release\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b98433d-7f8e-4ded-9eb4-1696faebcffe-agent-certs\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-device-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-var-lib-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-ovn\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-system-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-netns\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.449975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-etc-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovn-node-metrics-cert\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-host\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cnibin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-os-release\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]:
I0416 16:00:45.450220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-kubelet\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj659\" (UniqueName: \"kubernetes.io/projected/2df5883a-3545-46d2-90ae-bd313fcad285-kube-api-access-nj659\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm" Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-systemd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.452019 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbk4q\" (UniqueName: \"kubernetes.io/projected/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-kube-api-access-cbk4q\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-var-lib-kubelet\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-host\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnlr\" (UniqueName: \"kubernetes.io/projected/8d975b4a-cf06-476b-bfc4-6f887bd4940d-kube-api-access-sbnlr\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450707 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-sys\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-node-log\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-netd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.452547 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:00:45.450922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.450971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-tuned\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.452547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.451006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-registration-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.454231 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.454212 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:00:45.468881 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.468849 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:55:44 +0000 UTC" deadline="2028-01-31 21:11:42.42378984 +0000 UTC" Apr 16 16:00:45.468881 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.468879 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15725h10m56.954912803s" Apr 16 16:00:45.484064 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.484047 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gjfzq" Apr 16 16:00:45.494913 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.494845 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:00:45.495367 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.495352 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gjfzq" Apr 16 16:00:45.506537 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.506486 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e885656e91016de60540dee8de7f05c.slice/crio-cabdbc8a7864480d209e0f8b2907eddc8bb452da4223b2f7bcfe270a606184b6 WatchSource:0}: Error finding container cabdbc8a7864480d209e0f8b2907eddc8bb452da4223b2f7bcfe270a606184b6: Status 404 returned error can't find the container with id cabdbc8a7864480d209e0f8b2907eddc8bb452da4223b2f7bcfe270a606184b6 Apr 16 16:00:45.506795 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.506776 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16cda4de244fe8d0f7938f19308551ad.slice/crio-46555775ff379fcdb68cdb59107a3028a17af295a4dd483728bc9dbf2921624a WatchSource:0}: Error finding container 46555775ff379fcdb68cdb59107a3028a17af295a4dd483728bc9dbf2921624a: Status 404 returned error can't find the container with id 46555775ff379fcdb68cdb59107a3028a17af295a4dd483728bc9dbf2921624a Apr 16 16:00:45.514827 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.514813 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:00:45.547760 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.547717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" event={"ID":"16cda4de244fe8d0f7938f19308551ad","Type":"ContainerStarted","Data":"46555775ff379fcdb68cdb59107a3028a17af295a4dd483728bc9dbf2921624a"} Apr 16 16:00:45.548598 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.548572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" event={"ID":"2e885656e91016de60540dee8de7f05c","Type":"ContainerStarted","Data":"cabdbc8a7864480d209e0f8b2907eddc8bb452da4223b2f7bcfe270a606184b6"} Apr 16 16:00:45.551801 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-registration-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.551864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-lib-modules\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.551864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-tmp\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.551864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551843 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:45.551973 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-kubelet\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552011 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552011 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.551986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-lib-modules\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.552011 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-script-lib\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552152 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gdt9\" (UniqueName: \"kubernetes.io/projected/803316ee-89ca-4413-8b50-f8e2771aae54-kube-api-access-7gdt9\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.552152 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-kubelet\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552152 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:45.552152 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-registration-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: 
\"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-slash\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.552235 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-bin\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-slash\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-systemd\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552231 2577 swap_util.go:74] 
"error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-system-cni-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.552369 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2df5883a-3545-46d2-90ae-bd313fcad285-iptables-alerter-script\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-systemd\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.552385 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:00:46.052356823 +0000 UTC m=+2.173014758 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdn48\" (UniqueName: \"kubernetes.io/projected/b0890f1f-a03c-4336-8b33-5daaa65706b0-kube-api-access-kdn48\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-system-cni-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b98433d-7f8e-4ded-9eb4-1696faebcffe-konnectivity-ca\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552757 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-hostroot\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-sys-fs\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.552829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-conf\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-run\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brd5c\" (UniqueName: \"kubernetes.io/projected/dc41db5b-cc00-4f67-be36-85fcec39776e-kube-api-access-brd5c\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552890 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-log-socket\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-netns\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-sys-fs\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-script-lib\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552975 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-daemon-config\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b98433d-7f8e-4ded-9eb4-1696faebcffe-konnectivity-ca\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.552991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-hostroot\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-multus\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-log-socket\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-bin\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-netns\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-run\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-conf\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-etc-kubernetes\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.553277 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-multus\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-socket-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-etc-kubernetes\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-modprobe-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysconfig\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-socket-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-modprobe-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsdb\" (UniqueName: \"kubernetes.io/projected/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-kube-api-access-6vsdb\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysconfig\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2df5883a-3545-46d2-90ae-bd313fcad285-host-slash\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2df5883a-3545-46d2-90ae-bd313fcad285-host-slash\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cni-binary-copy\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-socket-dir-parent\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-bin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-config\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-daemon-config\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-env-overrides\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-cni-bin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-conf-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-socket-dir-parent\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-multus-certs\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-kubernetes\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-cnibin\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-systemd-units\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2df5883a-3545-46d2-90ae-bd313fcad285-iptables-alerter-script\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-serviceca\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-kubernetes\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-cnibin\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-systemd-units\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554043 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-multus-conf-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.553961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-k8s-cni-cncf-io\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5nss\" (UniqueName: \"kubernetes.io/projected/86d0be28-1879-425a-a135-e8471798e397-kube-api-access-b5nss\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-os-release\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.554727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cni-binary-copy\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b98433d-7f8e-4ded-9eb4-1696faebcffe-agent-certs\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-env-overrides\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-device-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovnkube-config\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-k8s-cni-cncf-io\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-run-multus-certs\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-var-lib-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-ovn\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-serviceca\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-os-release\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-device-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-system-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.555517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-ovn\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-system-cni-dir\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-netns\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-var-lib-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-etc-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86d0be28-1879-425a-a135-e8471798e397-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-run-netns\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovn-node-metrics-cert\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-host\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-etc-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cnibin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.554990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-host\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-os-release\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-cnibin\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-kubelet\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-os-release\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-host-var-lib-kubelet\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.556178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj659\" (UniqueName: \"kubernetes.io/projected/2df5883a-3545-46d2-90ae-bd313fcad285-kube-api-access-nj659\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-systemd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbk4q\" (UniqueName: \"kubernetes.io/projected/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-kube-api-access-cbk4q\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-systemd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-var-lib-kubelet\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-host\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnlr\" (UniqueName: \"kubernetes.io/projected/8d975b4a-cf06-476b-bfc4-6f887bd4940d-kube-api-access-sbnlr\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-host\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-var-lib-kubelet\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-sys\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-node-log\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555977 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-tmp\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-sys\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.557037 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.555982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-netd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-run-openvswitch\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-tuned\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-host-cni-netd\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0890f1f-a03c-4336-8b33-5daaa65706b0-node-log\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-sysctl-d\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z"
Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc"
Apr 16 
16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.556803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/803316ee-89ca-4413-8b50-f8e2771aae54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.557271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0890f1f-a03c-4336-8b33-5daaa65706b0-ovn-node-metrics-cert\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.558020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.557486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b98433d-7f8e-4ded-9eb4-1696faebcffe-agent-certs\") pod \"konnectivity-agent-v84j7\" (UID: \"6b98433d-7f8e-4ded-9eb4-1696faebcffe\") " pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:00:45.558890 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.558850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d975b4a-cf06-476b-bfc4-6f887bd4940d-etc-tuned\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.559311 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.559294 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:00:45.559359 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.559348 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:00:45.559435 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.559410 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:45.559576 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:45.559560 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. No retries permitted until 2026-04-16 16:00:46.059539585 +0000 UTC m=+2.180197520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:45.559650 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.559631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/803316ee-89ca-4413-8b50-f8e2771aae54-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.563304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsdb\" (UniqueName: 
\"kubernetes.io/projected/72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f-kube-api-access-6vsdb\") pod \"node-ca-lz8kp\" (UID: \"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f\") " pod="openshift-image-registry/node-ca-lz8kp" Apr 16 16:00:45.563304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdn48\" (UniqueName: \"kubernetes.io/projected/b0890f1f-a03c-4336-8b33-5daaa65706b0-kube-api-access-kdn48\") pod \"ovnkube-node-grb4s\" (UID: \"b0890f1f-a03c-4336-8b33-5daaa65706b0\") " pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.563483 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5nss\" (UniqueName: \"kubernetes.io/projected/86d0be28-1879-425a-a135-e8471798e397-kube-api-access-b5nss\") pod \"aws-ebs-csi-driver-node-zdztx\" (UID: \"86d0be28-1879-425a-a135-e8471798e397\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.563483 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gdt9\" (UniqueName: \"kubernetes.io/projected/803316ee-89ca-4413-8b50-f8e2771aae54-kube-api-access-7gdt9\") pod \"multus-additional-cni-plugins-qmfsc\" (UID: \"803316ee-89ca-4413-8b50-f8e2771aae54\") " pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.563784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd5c\" (UniqueName: \"kubernetes.io/projected/dc41db5b-cc00-4f67-be36-85fcec39776e-kube-api-access-brd5c\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:45.563945 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.563930 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbk4q\" (UniqueName: \"kubernetes.io/projected/4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2-kube-api-access-cbk4q\") pod \"multus-z72rs\" (UID: \"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2\") " pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.564038 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.564024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj659\" (UniqueName: \"kubernetes.io/projected/2df5883a-3545-46d2-90ae-bd313fcad285-kube-api-access-nj659\") pod \"iptables-alerter-wc9sm\" (UID: \"2df5883a-3545-46d2-90ae-bd313fcad285\") " pod="openshift-network-operator/iptables-alerter-wc9sm" Apr 16 16:00:45.564140 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.564125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnlr\" (UniqueName: \"kubernetes.io/projected/8d975b4a-cf06-476b-bfc4-6f887bd4940d-kube-api-access-sbnlr\") pod \"tuned-kpt9z\" (UID: \"8d975b4a-cf06-476b-bfc4-6f887bd4940d\") " pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.759767 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.759686 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:00:45.766351 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.766322 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" Apr 16 16:00:45.766581 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.766562 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b98433d_7f8e_4ded_9eb4_1696faebcffe.slice/crio-31ba9c26bd8a6e98e8c6dd42d4e899eb12d375e4e40fc474cdad383c5d3c1efa WatchSource:0}: Error finding container 31ba9c26bd8a6e98e8c6dd42d4e899eb12d375e4e40fc474cdad383c5d3c1efa: Status 404 returned error can't find the container with id 31ba9c26bd8a6e98e8c6dd42d4e899eb12d375e4e40fc474cdad383c5d3c1efa Apr 16 16:00:45.772357 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.772333 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d975b4a_cf06_476b_bfc4_6f887bd4940d.slice/crio-838ef7b38d19b4388c2ac97142b60715ce24bdbb5d3dffe26ee470c8a8fca933 WatchSource:0}: Error finding container 838ef7b38d19b4388c2ac97142b60715ce24bdbb5d3dffe26ee470c8a8fca933: Status 404 returned error can't find the container with id 838ef7b38d19b4388c2ac97142b60715ce24bdbb5d3dffe26ee470c8a8fca933 Apr 16 16:00:45.783110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.783089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lz8kp" Apr 16 16:00:45.787612 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.787595 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" Apr 16 16:00:45.789177 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.789155 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72eac7d7_86dd_4e53_a6fb_3b9ea7c2b96f.slice/crio-944e86dfcb186dbf1938db20fc76d1f335049251cacb604436695bf332a16dd2 WatchSource:0}: Error finding container 944e86dfcb186dbf1938db20fc76d1f335049251cacb604436695bf332a16dd2: Status 404 returned error can't find the container with id 944e86dfcb186dbf1938db20fc76d1f335049251cacb604436695bf332a16dd2 Apr 16 16:00:45.794819 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.794796 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803316ee_89ca_4413_8b50_f8e2771aae54.slice/crio-2fc4cd0156096cb6272ac2446c23cd8620dbb48e11b85101c69641d66cb1e321 WatchSource:0}: Error finding container 2fc4cd0156096cb6272ac2446c23cd8620dbb48e11b85101c69641d66cb1e321: Status 404 returned error can't find the container with id 2fc4cd0156096cb6272ac2446c23cd8620dbb48e11b85101c69641d66cb1e321 Apr 16 16:00:45.812033 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.812008 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z72rs" Apr 16 16:00:45.818012 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.817988 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f04a36e_4a07_4d11_aa7a_7a9cfb6156c2.slice/crio-428d1d6f70c15c9d893240e24d5682c9363415e5057d9786984b9737bfa9d224 WatchSource:0}: Error finding container 428d1d6f70c15c9d893240e24d5682c9363415e5057d9786984b9737bfa9d224: Status 404 returned error can't find the container with id 428d1d6f70c15c9d893240e24d5682c9363415e5057d9786984b9737bfa9d224 Apr 16 16:00:45.819319 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.819301 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wc9sm" Apr 16 16:00:45.825430 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.825407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" Apr 16 16:00:45.825637 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.825613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df5883a_3545_46d2_90ae_bd313fcad285.slice/crio-96d43222c06f12b3dfba16c53f005e85e5c9622e110c3ec161ca0dbfef91252a WatchSource:0}: Error finding container 96d43222c06f12b3dfba16c53f005e85e5c9622e110c3ec161ca0dbfef91252a: Status 404 returned error can't find the container with id 96d43222c06f12b3dfba16c53f005e85e5c9622e110c3ec161ca0dbfef91252a Apr 16 16:00:45.828850 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.828834 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:00:45.832679 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.832659 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d0be28_1879_425a_a135_e8471798e397.slice/crio-e893945760ee7697fa50cfb1f3592035e49457d17eca2ab79c477324ca036d13 WatchSource:0}: Error finding container e893945760ee7697fa50cfb1f3592035e49457d17eca2ab79c477324ca036d13: Status 404 returned error can't find the container with id e893945760ee7697fa50cfb1f3592035e49457d17eca2ab79c477324ca036d13 Apr 16 16:00:45.839402 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:00:45.839380 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0890f1f_a03c_4336_8b33_5daaa65706b0.slice/crio-f312fc00433ba4f1ead290750246961f89c84f7f3c1fd5a670a2272a1e005b30 WatchSource:0}: Error finding container f312fc00433ba4f1ead290750246961f89c84f7f3c1fd5a670a2272a1e005b30: Status 404 returned error can't find the container with id f312fc00433ba4f1ead290750246961f89c84f7f3c1fd5a670a2272a1e005b30 Apr 16 16:00:45.949292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:45.949254 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:00:46.058609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.058520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:46.058760 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.058659 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:46.058760 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.058728 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:00:47.058710611 +0000 UTC m=+3.179368545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:46.159691 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.159653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:46.159857 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.159836 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:00:46.159857 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.159855 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:00:46.159962 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.159868 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:46.159962 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:46.159954 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. No retries permitted until 2026-04-16 16:00:47.159934255 +0000 UTC m=+3.280592188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:46.496354 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.496269 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:55:45 +0000 UTC" deadline="2027-10-04 21:17:05.184855585 +0000 UTC" Apr 16 16:00:46.496354 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.496310 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12869h16m18.688550745s" Apr 16 16:00:46.565340 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.565293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" event={"ID":"86d0be28-1879-425a-a135-e8471798e397","Type":"ContainerStarted","Data":"e893945760ee7697fa50cfb1f3592035e49457d17eca2ab79c477324ca036d13"} Apr 16 16:00:46.571609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.571571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z72rs" 
event={"ID":"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2","Type":"ContainerStarted","Data":"428d1d6f70c15c9d893240e24d5682c9363415e5057d9786984b9737bfa9d224"} Apr 16 16:00:46.592335 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.592289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" event={"ID":"8d975b4a-cf06-476b-bfc4-6f887bd4940d","Type":"ContainerStarted","Data":"838ef7b38d19b4388c2ac97142b60715ce24bdbb5d3dffe26ee470c8a8fca933"} Apr 16 16:00:46.602490 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.602442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v84j7" event={"ID":"6b98433d-7f8e-4ded-9eb4-1696faebcffe","Type":"ContainerStarted","Data":"31ba9c26bd8a6e98e8c6dd42d4e899eb12d375e4e40fc474cdad383c5d3c1efa"} Apr 16 16:00:46.612228 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.612201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"f312fc00433ba4f1ead290750246961f89c84f7f3c1fd5a670a2272a1e005b30"} Apr 16 16:00:46.631423 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.631390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wc9sm" event={"ID":"2df5883a-3545-46d2-90ae-bd313fcad285","Type":"ContainerStarted","Data":"96d43222c06f12b3dfba16c53f005e85e5c9622e110c3ec161ca0dbfef91252a"} Apr 16 16:00:46.635489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.635431 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerStarted","Data":"2fc4cd0156096cb6272ac2446c23cd8620dbb48e11b85101c69641d66cb1e321"} Apr 16 16:00:46.637164 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.637141 2577 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:00:46.639605 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:46.639578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lz8kp" event={"ID":"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f","Type":"ContainerStarted","Data":"944e86dfcb186dbf1938db20fc76d1f335049251cacb604436695bf332a16dd2"} Apr 16 16:00:47.067315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.067273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:47.067525 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.067436 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:47.067598 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.067526 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:00:49.067505425 +0000 UTC m=+5.188163346 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:47.168557 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.168519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:47.168771 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.168732 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:00:47.168771 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.168755 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:00:47.168771 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.168768 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:47.168959 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.168824 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:00:49.168805933 +0000 UTC m=+5.289463857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:47.312027 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.311983 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:00:47.496635 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.496578 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:55:45 +0000 UTC" deadline="2027-11-13 22:22:38.83293841 +0000 UTC" Apr 16 16:00:47.496635 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.496614 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13830h21m51.336327976s" Apr 16 16:00:47.545236 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.545205 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:47.545430 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.545339 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:47.545810 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:47.545790 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:47.545914 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:47.545894 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:49.086582 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:49.086548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:49.087030 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.086808 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:49.087030 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.086898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:00:53.086859384 +0000 UTC m=+9.207517626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:49.187639 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:49.187604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:49.187780 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.187755 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:00:49.187780 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.187773 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:00:49.187892 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.187786 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:49.187892 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.187848 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:00:53.187828755 +0000 UTC m=+9.308486678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:49.545174 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:49.545096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:49.545324 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.545220 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:49.545705 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:49.545685 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:49.545834 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:49.545785 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:51.546496 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:51.545810 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:51.546496 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:51.545934 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:51.546496 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:51.546321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:51.546496 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:51.546421 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:53.121644 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:53.121595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:53.122186 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.121758 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:53.122186 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.121841 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:01.121821684 +0000 UTC m=+17.242479606 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:00:53.222008 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:53.221978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:53.222187 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.222160 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:00:53.222187 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.222178 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:00:53.222289 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.222190 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:53.222289 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.222242 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:01:01.222222948 +0000 UTC m=+17.342880879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:00:53.545142 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:53.545062 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:53.545342 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.545192 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:53.545658 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:53.545631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:53.545786 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:53.545750 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:55.545642 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:55.545608 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:55.545973 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:55.545619 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:55.545973 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:55.545723 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:55.545973 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:55.545794 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:57.545765 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:57.545725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:57.546155 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:57.545725 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:57.546155 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:57.545859 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:00:57.546155 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:57.545908 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:59.545820 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:59.545784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:00:59.546245 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:00:59.545787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:00:59.546245 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:59.545887 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:00:59.546245 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:00:59.546020 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:01.178000 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:01.177954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:01.178493 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.178116 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:01.178493 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.178183 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.178166134 +0000 UTC m=+33.298824055 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:01.278256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:01.278224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:01.278424 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.278374 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:01:01.278424 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.278389 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:01:01.278424 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.278398 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dkczl for pod openshift-network-diagnostics/network-check-target-lc585: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:01.278577 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.278470 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl podName:1c981ccc-1389-4ae2-9023-1e308ba5dbb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:01:17.278435643 +0000 UTC m=+33.399093579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dkczl" (UniqueName: "kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl") pod "network-check-target-lc585" (UID: "1c981ccc-1389-4ae2-9023-1e308ba5dbb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:01.545217 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:01.545138 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:01.545217 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:01.545154 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:01.545412 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.545248 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:01.545412 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:01.545395 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:03.545323 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.545147 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:03.545666 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.545251 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:03.545666 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:03.545439 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:03.545666 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:03.545551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:03.676395 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.676370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" event={"ID":"2e885656e91016de60540dee8de7f05c","Type":"ContainerStarted","Data":"3ef1a50bd0dfd3fde1ec9562fa781bbdc5960bd3f5d5cb0d6ce450e0dffed082"} Apr 16 16:01:03.679836 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.679804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z72rs" event={"ID":"4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2","Type":"ContainerStarted","Data":"3ec1bb5f2e8ce3f9deffa74764642340d686117af96cc2da774724569a4bd76a"} Apr 16 16:01:03.681988 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.681964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" event={"ID":"8d975b4a-cf06-476b-bfc4-6f887bd4940d","Type":"ContainerStarted","Data":"847b731d9e44d7ac0a8230178239375aafa4e455aa5850719f2208272299ced2"} Apr 16 16:01:03.683808 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.683783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"212c5e33fbcf977ea7c263512584e40e8573de7f2e20c4e9467413d5f0750be1"} Apr 16 16:01:03.683878 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.683816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"0cd919481f175adf21585b97a3d1ee178b87190ccec6f50f96b8dbe94c162a75"} Apr 16 16:01:03.689882 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.689837 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-78.ec2.internal" podStartSLOduration=18.689826001 podStartE2EDuration="18.689826001s" podCreationTimestamp="2026-04-16 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:03.689494574 +0000 UTC m=+19.810152512" watchObservedRunningTime="2026-04-16 16:01:03.689826001 +0000 UTC m=+19.810483939" Apr 16 16:01:03.724040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:03.724006 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z72rs" podStartSLOduration=2.076957949 podStartE2EDuration="19.723993386s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.819466809 +0000 UTC m=+1.940124725" lastFinishedPulling="2026-04-16 16:01:03.466502241 +0000 UTC m=+19.587160162" observedRunningTime="2026-04-16 16:01:03.706389284 +0000 UTC m=+19.827047223" watchObservedRunningTime="2026-04-16 16:01:03.723993386 +0000 UTC m=+19.844651324" Apr 16 16:01:04.314564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.314525 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kpt9z" podStartSLOduration=2.639839174 podStartE2EDuration="20.314507874s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.773796657 +0000 UTC m=+1.894454578" lastFinishedPulling="2026-04-16 16:01:03.448465347 +0000 UTC m=+19.569123278" observedRunningTime="2026-04-16 16:01:03.723844571 +0000 UTC m=+19.844502522" watchObservedRunningTime="2026-04-16 16:01:04.314507874 +0000 UTC m=+20.435165814" Apr 16 16:01:04.314694 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.314682 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rbq9r"] Apr 16 16:01:04.316550 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.316531 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.319297 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.319279 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:01:04.319370 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.319283 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:01:04.319536 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.319521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sxb74\"" Apr 16 16:01:04.398633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.398596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-hosts-file\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.398776 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.398644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsx5c\" (UniqueName: \"kubernetes.io/projected/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-kube-api-access-xsx5c\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.398776 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.398730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-tmp-dir\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.499065 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.499043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-tmp-dir\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.499213 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.499075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-hosts-file\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.499213 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.499099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsx5c\" (UniqueName: \"kubernetes.io/projected/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-kube-api-access-xsx5c\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.499213 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.499189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-hosts-file\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.499376 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.499361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-tmp-dir\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.512671 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.512644 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsx5c\" (UniqueName: \"kubernetes.io/projected/3ef774fd-1deb-4665-96a2-b7fe3ffcc315-kube-api-access-xsx5c\") pod \"node-resolver-rbq9r\" (UID: \"3ef774fd-1deb-4665-96a2-b7fe3ffcc315\") " pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.666728 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.666666 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rbq9r" Apr 16 16:01:04.673618 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:04.673496 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ef774fd_1deb_4665_96a2_b7fe3ffcc315.slice/crio-9bd21cea20362324ba8898511ee66f41d99d9c78fd5aa60a815b9fa608cb80f1 WatchSource:0}: Error finding container 9bd21cea20362324ba8898511ee66f41d99d9c78fd5aa60a815b9fa608cb80f1: Status 404 returned error can't find the container with id 9bd21cea20362324ba8898511ee66f41d99d9c78fd5aa60a815b9fa608cb80f1 Apr 16 16:01:04.685827 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.685806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lz8kp" event={"ID":"72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f","Type":"ContainerStarted","Data":"fad565cfcecf0b012757b12efe0fb10ec6cf0cf05067308ba4746e805be9c212"} Apr 16 16:01:04.686758 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.686738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rbq9r" event={"ID":"3ef774fd-1deb-4665-96a2-b7fe3ffcc315","Type":"ContainerStarted","Data":"9bd21cea20362324ba8898511ee66f41d99d9c78fd5aa60a815b9fa608cb80f1"} Apr 16 16:01:04.688132 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.688112 2577 generic.go:358] "Generic (PLEG): container finished" podID="16cda4de244fe8d0f7938f19308551ad" containerID="f47889448520ec42d7a6b91d665326dd3659ac3cda6c2cd080edb7632665eec4" exitCode=0 
Apr 16 16:01:04.688224 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.688169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" event={"ID":"16cda4de244fe8d0f7938f19308551ad","Type":"ContainerDied","Data":"f47889448520ec42d7a6b91d665326dd3659ac3cda6c2cd080edb7632665eec4"} Apr 16 16:01:04.689502 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.689476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" event={"ID":"86d0be28-1879-425a-a135-e8471798e397","Type":"ContainerStarted","Data":"9034f752515290c288f8c4e17b2d5a83da41d3ca3d5e4cbd29641a9e6ba0ffea"} Apr 16 16:01:04.690792 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.690767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v84j7" event={"ID":"6b98433d-7f8e-4ded-9eb4-1696faebcffe","Type":"ContainerStarted","Data":"19db791748510a2bb9f233b31af4671e990232993e0ddaa077c77e62d30850c6"} Apr 16 16:01:04.694545 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694529 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:01:04.694892 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694864 2577 generic.go:358] "Generic (PLEG): container finished" podID="b0890f1f-a03c-4336-8b33-5daaa65706b0" containerID="212c5e33fbcf977ea7c263512584e40e8573de7f2e20c4e9467413d5f0750be1" exitCode=1 Apr 16 16:01:04.694999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694929 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerDied","Data":"212c5e33fbcf977ea7c263512584e40e8573de7f2e20c4e9467413d5f0750be1"} Apr 16 16:01:04.694999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694946 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"bb76088bccc8df6b783f5ad28192da9105bcffddb56d47d63b7e773f67a80e5a"} Apr 16 16:01:04.694999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"9f336d57e83682b0a352d8f15bf8e11d936aa10438d2d9135d111c9718d52b48"} Apr 16 16:01:04.694999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"b6a66f78229281d7997342cc55aa8af2ba6dbabf5f4700948b31318d2b99e1bc"} Apr 16 16:01:04.694999 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.694972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"ba93055b3f9b3875a05628ad0672c340d680b8ff6b4fb1218b09dc226b444196"} Apr 16 16:01:04.698884 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.698856 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="4f2908644cb9ba4d5b608aff1960b6e5d2ed7a5f13554eea4a0d69bceaa8f6dc" exitCode=0 Apr 16 16:01:04.699027 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.698990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"4f2908644cb9ba4d5b608aff1960b6e5d2ed7a5f13554eea4a0d69bceaa8f6dc"} Apr 16 16:01:04.705223 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.700440 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lz8kp" podStartSLOduration=3.045310817 podStartE2EDuration="20.700428061s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.791512059 +0000 UTC m=+1.912169982" lastFinishedPulling="2026-04-16 16:01:03.446629297 +0000 UTC m=+19.567287226" observedRunningTime="2026-04-16 16:01:04.700210793 +0000 UTC m=+20.820868734" watchObservedRunningTime="2026-04-16 16:01:04.700428061 +0000 UTC m=+20.821086001" Apr 16 16:01:04.741572 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:04.741531 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v84j7" podStartSLOduration=3.062559454 podStartE2EDuration="20.741518611s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.769188116 +0000 UTC m=+1.889846036" lastFinishedPulling="2026-04-16 16:01:03.448147272 +0000 UTC m=+19.568805193" observedRunningTime="2026-04-16 16:01:04.741475141 +0000 UTC m=+20.862133080" watchObservedRunningTime="2026-04-16 16:01:04.741518611 +0000 UTC m=+20.862176550" Apr 16 16:01:05.545401 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.545381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:05.545508 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:05.545490 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:05.545574 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.545563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:05.545641 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:05.545627 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:05.548948 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.548927 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:01:05.702565 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.702480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rbq9r" event={"ID":"3ef774fd-1deb-4665-96a2-b7fe3ffcc315","Type":"ContainerStarted","Data":"e77820113adfabe2c8e0cf66529ac9931c345b530540c2134c8daf355106ad1c"} Apr 16 16:01:05.704294 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.704263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" event={"ID":"16cda4de244fe8d0f7938f19308551ad","Type":"ContainerStarted","Data":"99c1664d9f0b9ca64610e50d9409b439fb54514efd56c534c89ec9187279d827"} Apr 16 16:01:05.705992 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.705913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" 
event={"ID":"86d0be28-1879-425a-a135-e8471798e397","Type":"ContainerStarted","Data":"c846aa045365afe4bcf3e848d3babf3d6d3d7d33b570e708938da5772ef4ed18"} Apr 16 16:01:05.707221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.707197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wc9sm" event={"ID":"2df5883a-3545-46d2-90ae-bd313fcad285","Type":"ContainerStarted","Data":"b17b48433b38a5b14fba5faec65183c3eeeded56d0774799f87eace359f44bc6"} Apr 16 16:01:05.716093 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.716052 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rbq9r" podStartSLOduration=1.7160399640000001 podStartE2EDuration="1.716039964s" podCreationTimestamp="2026-04-16 16:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:05.715586185 +0000 UTC m=+21.836244125" watchObservedRunningTime="2026-04-16 16:01:05.716039964 +0000 UTC m=+21.836697903" Apr 16 16:01:05.727818 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.727779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-78.ec2.internal" podStartSLOduration=20.727765629 podStartE2EDuration="20.727765629s" podCreationTimestamp="2026-04-16 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:05.727710157 +0000 UTC m=+21.848368109" watchObservedRunningTime="2026-04-16 16:01:05.727765629 +0000 UTC m=+21.848423567" Apr 16 16:01:05.739406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:05.739358 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wc9sm" podStartSLOduration=4.157784829 podStartE2EDuration="21.739341742s" 
podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.828504836 +0000 UTC m=+1.949162755" lastFinishedPulling="2026-04-16 16:01:03.41006175 +0000 UTC m=+19.530719668" observedRunningTime="2026-04-16 16:01:05.739244912 +0000 UTC m=+21.859902855" watchObservedRunningTime="2026-04-16 16:01:05.739341742 +0000 UTC m=+21.859999684" Apr 16 16:01:06.530039 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:06.529932 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:01:05.548944959Z","UUID":"128de657-3d95-4638-9de7-90f9ffec7650","Handler":null,"Name":"","Endpoint":""} Apr 16 16:01:06.531560 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:06.531535 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:01:06.531560 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:06.531564 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:01:06.712410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:06.712385 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:01:06.712922 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:06.712884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"f23504d0566e8f2d883f126485c746ec36271a75b12b639fd274455fc3c432a6"} Apr 16 16:01:07.545637 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:07.545611 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:07.545637 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:07.545625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:07.545842 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:07.545725 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:07.545906 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:07.545856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:07.717114 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:07.717036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" event={"ID":"86d0be28-1879-425a-a135-e8471798e397","Type":"ContainerStarted","Data":"9953547cd9bfaf492b5b6b9063089eb4bd8359ed1b505098c62049a8cfd693db"} Apr 16 16:01:07.744926 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:07.744875 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdztx" podStartSLOduration=2.535075901 podStartE2EDuration="23.744859621s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.834638636 +0000 UTC m=+1.955296555" lastFinishedPulling="2026-04-16 16:01:07.044422353 +0000 UTC m=+23.165080275" observedRunningTime="2026-04-16 16:01:07.744410584 +0000 UTC m=+23.865068527" watchObservedRunningTime="2026-04-16 16:01:07.744859621 +0000 UTC m=+23.865517559" Apr 16 16:01:08.644622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:08.644583 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:01:08.645325 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:08.645296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:01:08.913034 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:08.912863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:01:08.913756 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:08.913735 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v84j7" Apr 16 16:01:09.545932 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.545879 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:09.546016 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.545948 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:09.546077 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:09.546059 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:09.546200 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:09.546178 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:09.723400 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.723372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:01:09.723748 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.723709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"b68d977301107b50f830698dc589ae4866dee9dbec768e13f484f08eff1fd84d"} Apr 16 16:01:09.724098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.724080 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:09.724250 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.724230 2577 scope.go:117] "RemoveContainer" containerID="212c5e33fbcf977ea7c263512584e40e8573de7f2e20c4e9467413d5f0750be1" Apr 16 16:01:09.725420 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.725396 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="057980f68b5db3aa9600a8ba1914fd249983641db7d5e3db21242bd8b6f53238" exitCode=0 Apr 16 16:01:09.725534 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.725496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"057980f68b5db3aa9600a8ba1914fd249983641db7d5e3db21242bd8b6f53238"} Apr 16 16:01:09.739288 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:09.739270 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:10.729996 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:01:10.729970 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:01:10.730804 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.730230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" event={"ID":"b0890f1f-a03c-4336-8b33-5daaa65706b0","Type":"ContainerStarted","Data":"696115a9efe1d26b85e7d2955317a1536915c2408bb17c5643d6f20eb50729cd"} Apr 16 16:01:10.730804 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.730552 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:01:10.730804 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.730796 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:10.745278 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.745258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:10.763333 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.763300 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" podStartSLOduration=9.114255355 podStartE2EDuration="26.763290462s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.841133947 +0000 UTC m=+1.961791883" lastFinishedPulling="2026-04-16 16:01:03.490169073 +0000 UTC m=+19.610826990" observedRunningTime="2026-04-16 16:01:10.761832178 +0000 UTC m=+26.882490117" watchObservedRunningTime="2026-04-16 16:01:10.763290462 +0000 UTC m=+26.883948401" Apr 16 16:01:10.807596 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.807546 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nwbqd"] Apr 16 16:01:10.807722 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:01:10.807680 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:10.807781 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:10.807764 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:10.810085 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.810062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lc585"] Apr 16 16:01:10.810192 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:10.810152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:10.810252 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:10.810217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:11.733989 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:11.733795 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="dda5aae5539d54c66f6349ffcae973c7a45f58bd13a058b8cb78c026bc86b8d2" exitCode=0 Apr 16 16:01:11.734380 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:11.733876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"dda5aae5539d54c66f6349ffcae973c7a45f58bd13a058b8cb78c026bc86b8d2"} Apr 16 16:01:11.734380 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:11.734132 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:01:12.545775 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.545704 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:12.545927 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.545704 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:12.545927 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:12.545829 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:12.545927 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:12.545870 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1" Apr 16 16:01:12.738741 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.738717 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:01:12.899545 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.899519 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hwwrn"] Apr 16 16:01:12.920241 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.920216 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hwwrn"] Apr 16 16:01:12.920377 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:12.920363 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:12.920487 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:12.920442 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hwwrn" podUID="0af32a28-5811-4383-8faf-b6f21a0e2e4e" Apr 16 16:01:13.063742 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.063712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-dbus\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.063870 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.063787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-kubelet-config\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.063870 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.063808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.165014 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.164988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-kubelet-config\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.165156 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.165020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.165156 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.165048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-dbus\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.165156 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.165132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-kubelet-config\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.165265 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:13.165158 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:13.165265 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:13.165217 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret podName:0af32a28-5811-4383-8faf-b6f21a0e2e4e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:13.665200431 +0000 UTC m=+29.785858351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret") pod "global-pull-secret-syncer-hwwrn" (UID: "0af32a28-5811-4383-8faf-b6f21a0e2e4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:13.165265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.165257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0af32a28-5811-4383-8faf-b6f21a0e2e4e-dbus\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.669215 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.669184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.669394 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:13.669326 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:13.669394 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:13.669386 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret podName:0af32a28-5811-4383-8faf-b6f21a0e2e4e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:14.669372198 +0000 UTC m=+30.790030116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret") pod "global-pull-secret-syncer-hwwrn" (UID: "0af32a28-5811-4383-8faf-b6f21a0e2e4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:13.742663 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.742634 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="76f5660cee60ed6e75600914656323a0ad3c262028b2f3540d5bbf210239d249" exitCode=0 Apr 16 16:01:13.743024 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.742717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:13.743024 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:13.742717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"76f5660cee60ed6e75600914656323a0ad3c262028b2f3540d5bbf210239d249"} Apr 16 16:01:13.743024 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:13.742887 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hwwrn" podUID="0af32a28-5811-4383-8faf-b6f21a0e2e4e" Apr 16 16:01:14.411185 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.411139 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:14.411415 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.411393 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:01:14.423760 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.423556 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grb4s" Apr 16 16:01:14.546252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.546185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:14.546387 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:14.546308 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e" Apr 16 16:01:14.546764 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.546743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:14.546865 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:14.546843 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1"
Apr 16 16:01:14.677400 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:14.677346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn"
Apr 16 16:01:14.677625 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:14.677516 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:14.677625 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:14.677593 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret podName:0af32a28-5811-4383-8faf-b6f21a0e2e4e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:16.67757428 +0000 UTC m=+32.798232203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret") pod "global-pull-secret-syncer-hwwrn" (UID: "0af32a28-5811-4383-8faf-b6f21a0e2e4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:15.545345 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:15.545313 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwwrn"
Apr 16 16:01:15.545827 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:15.545421 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hwwrn" podUID="0af32a28-5811-4383-8faf-b6f21a0e2e4e"
Apr 16 16:01:16.545775 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.545625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd"
Apr 16 16:01:16.545775 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.545677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585"
Apr 16 16:01:16.545775 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.545767 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwbqd" podUID="dc41db5b-cc00-4f67-be36-85fcec39776e"
Apr 16 16:01:16.546474 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.545885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lc585" podUID="1c981ccc-1389-4ae2-9023-1e308ba5dbb1"
Apr 16 16:01:16.676921 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.676888 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-78.ec2.internal" event="NodeReady"
Apr 16 16:01:16.677077 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.677020 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:01:16.692987 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.692965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn"
Apr 16 16:01:16.693095 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.693076 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:16.693150 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.693139 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret podName:0af32a28-5811-4383-8faf-b6f21a0e2e4e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.693115461 +0000 UTC m=+36.813773379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret") pod "global-pull-secret-syncer-hwwrn" (UID: "0af32a28-5811-4383-8faf-b6f21a0e2e4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:16.712326 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.712302 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"]
Apr 16 16:01:16.729274 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.729249 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"]
Apr 16 16:01:16.730391 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.729746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.733965 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.733839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:01:16.734090 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.734048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:01:16.734243 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.734226 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:01:16.734440 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.734422 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8px99\""
Apr 16 16:01:16.746166 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.746072 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:01:16.749806 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.749782 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"]
Apr 16 16:01:16.749915 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.749898 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"
Apr 16 16:01:16.752858 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.752723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 16:01:16.752858 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.752750 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 16:01:16.753024 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.752929 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5pfpn\""
Apr 16 16:01:16.767552 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.767533 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"]
Apr 16 16:01:16.767695 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.767679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.770371 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.770352 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 16:01:16.770513 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.770497 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9m69d\""
Apr 16 16:01:16.770571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.770528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 16:01:16.770803 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.770781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:01:16.771148 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.771125 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 16:01:16.787586 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.787561 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"]
Apr 16 16:01:16.787709 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.787696 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"
Apr 16 16:01:16.790709 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.790690 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:01:16.790796 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.790780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:01:16.790859 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.790842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jkwt7\""
Apr 16 16:01:16.793278 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793353 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793405 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793459 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793510 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz4f\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793584 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.793584 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.793570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.805736 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.805689 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"]
Apr 16 16:01:16.805845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.805814 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"
Apr 16 16:01:16.808983 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.808962 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7bz2d\""
Apr 16 16:01:16.808983 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.808977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 16:01:16.808983 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.808981 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 16:01:16.809220 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.809055 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 16:01:16.809220 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.809193 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 16:01:16.825340 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.825320 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"]
Apr 16 16:01:16.825486 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.825464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"
Apr 16 16:01:16.828315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.828292 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 16:01:16.846192 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.846158 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gt8zg"]
Apr 16 16:01:16.846367 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.846347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.849110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.849088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 16:01:16.849338 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.849319 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 16:01:16.849494 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.849475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 16:01:16.849610 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.849563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 16:01:16.866011 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.865986 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w97np"]
Apr 16 16:01:16.866247 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.866227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gt8zg"
Apr 16 16:01:16.869488 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.869469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5q2sn\""
Apr 16 16:01:16.869488 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.869480 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:01:16.869625 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.869534 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:01:16.869792 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.869776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:01:16.888388 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888363 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888396 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888408 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888426 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888433 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888440 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gt8zg"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"]
Apr 16 16:01:16.888538 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888475 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w97np"]
Apr 16 16:01:16.888923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.888545 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w97np"
Apr 16 16:01:16.891571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.891547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:01:16.891571 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.891553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jr2th\""
Apr 16 16:01:16.891712 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.891571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4bvn\" (UniqueName: \"kubernetes.io/projected/52558a16-a499-4eb9-9149-84e08a5d0414-kube-api-access-p4bvn\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-config\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-serving-cert\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.894533 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.894545 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.894601 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.394584356 +0000 UTC m=+33.515242296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz4f\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52558a16-a499-4eb9-9149-84e08a5d0414-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8q6d\" (UniqueName: \"kubernetes.io/projected/50ab2514-228d-4b71-9807-690d11b7d9b8-kube-api-access-x8q6d\") pod \"network-check-source-7b678d77c7-8hbqw\" (UID: \"50ab2514-228d-4b71-9807-690d11b7d9b8\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"
Apr 16 16:01:16.895280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/643d0a94-27da-40d4-a8e6-22ee5f06edb9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvb47\" (UniqueName: \"kubernetes.io/projected/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-kube-api-access-zvb47\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrbkm\" (UniqueName: \"kubernetes.io/projected/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-kube-api-access-jrbkm\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.894995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-tmp\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"
Apr 16 16:01:16.896252 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.895407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.896750 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.896729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.901254 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.901104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.901254 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.901142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.903306 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.903288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.903397 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.903384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz4f\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:01:16.996034 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-config\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.996034 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-tmp-dir\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-serving-cert\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-config-volume\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvtb\" (UniqueName: \"kubernetes.io/projected/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-kube-api-access-nvvtb\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhwzj\" (UniqueName: \"kubernetes.io/projected/43b389bd-40e6-4dcc-9713-4854f0caa438-kube-api-access-xhwzj\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8q6d\" (UniqueName: \"kubernetes.io/projected/50ab2514-228d-4b71-9807-690d11b7d9b8-kube-api-access-x8q6d\") pod \"network-check-source-7b678d77c7-8hbqw\" (UID: \"50ab2514-228d-4b71-9807-690d11b7d9b8\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b389bd-40e6-4dcc-9713-4854f0caa438-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:01:16.996315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4bvn\" (UniqueName: \"kubernetes.io/projected/52558a16-a499-4eb9-9149-84e08a5d0414-kube-api-access-p4bvn\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52558a16-a499-4eb9-9149-84e08a5d0414-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvb47\" (UniqueName: \"kubernetes.io/projected/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-kube-api-access-zvb47\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: 
\"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/643d0a94-27da-40d4-a8e6-22ee5f06edb9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.996590 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbkm\" (UniqueName: 
\"kubernetes.io/projected/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-kube-api-access-jrbkm\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:16.996659 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.496639603 +0000 UTC m=+33.617297537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:16.996786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-config\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" Apr 16 16:01:16.997492 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n42fb\" (UniqueName: \"kubernetes.io/projected/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-kube-api-access-n42fb\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:16.997492 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.996851 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-tmp\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:16.997492 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.997156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-tmp\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:16.997637 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.997497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/643d0a94-27da-40d4-a8e6-22ee5f06edb9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:16.998984 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.998961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-serving-cert\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" Apr 16 16:01:16.999634 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.999586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/52558a16-a499-4eb9-9149-84e08a5d0414-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" Apr 16 16:01:16.999634 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:16.999616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:17.005786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.005762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4bvn\" (UniqueName: \"kubernetes.io/projected/52558a16-a499-4eb9-9149-84e08a5d0414-kube-api-access-p4bvn\") pod \"managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd\" (UID: \"52558a16-a499-4eb9-9149-84e08a5d0414\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" Apr 16 16:01:17.006413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.006201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbkm\" (UniqueName: \"kubernetes.io/projected/98c3fa31-c5ca-46a4-9301-8d7fd96e850e-kube-api-access-jrbkm\") pod \"klusterlet-addon-workmgr-7bb999597c-26jds\" (UID: \"98c3fa31-c5ca-46a4-9301-8d7fd96e850e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:17.006413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.006208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvb47\" (UniqueName: \"kubernetes.io/projected/b37a5c3f-7b9a-4dcd-8949-35c0027725f3-kube-api-access-zvb47\") pod \"service-ca-operator-69965bb79d-8ts28\" (UID: \"b37a5c3f-7b9a-4dcd-8949-35c0027725f3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" Apr 16 16:01:17.006413 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:01:17.006350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8q6d\" (UniqueName: \"kubernetes.io/projected/50ab2514-228d-4b71-9807-690d11b7d9b8-kube-api-access-x8q6d\") pod \"network-check-source-7b678d77c7-8hbqw\" (UID: \"50ab2514-228d-4b71-9807-690d11b7d9b8\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw" Apr 16 16:01:17.075599 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.075541 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" Apr 16 16:01:17.096255 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.096231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw" Apr 16 16:01:17.098256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n42fb\" (UniqueName: \"kubernetes.io/projected/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-kube-api-access-n42fb\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:17.098343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.098343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: 
\"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-tmp-dir\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.098526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-config-volume\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.098526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvtb\" (UniqueName: \"kubernetes.io/projected/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-kube-api-access-nvvtb\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.098526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhwzj\" (UniqueName: 
\"kubernetes.io/projected/43b389bd-40e6-4dcc-9713-4854f0caa438-kube-api-access-xhwzj\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b389bd-40e6-4dcc-9713-4854f0caa438-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.098704 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.098727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-tmp-dir\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.098763 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.098760 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.598739808 +0000 UTC m=+33.719397739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:17.099032 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.098891 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:17.099032 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.098942 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.598929202 +0000 UTC m=+33.719587123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:17.099424 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.099400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b389bd-40e6-4dcc-9713-4854f0caa438-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.099424 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.099413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-config-volume\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.101961 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.101935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-ca\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.102056 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.102037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.102998 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:01:17.102968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.104570 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.104342 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b389bd-40e6-4dcc-9713-4854f0caa438-hub\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.107544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.107503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhwzj\" (UniqueName: \"kubernetes.io/projected/43b389bd-40e6-4dcc-9713-4854f0caa438-kube-api-access-xhwzj\") pod \"cluster-proxy-proxy-agent-6d78c68649-hdth6\" (UID: \"43b389bd-40e6-4dcc-9713-4854f0caa438\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.107671 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.107653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvtb\" (UniqueName: \"kubernetes.io/projected/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-kube-api-access-nvvtb\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.107725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.107684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n42fb\" (UniqueName: 
\"kubernetes.io/projected/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-kube-api-access-n42fb\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:17.128495 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.125330 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" Apr 16 16:01:17.137692 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.137121 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:17.157133 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.157050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" Apr 16 16:01:17.199869 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.199077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:17.199869 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.199340 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:17.199869 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.199410 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs podName:dc41db5b-cc00-4f67-be36-85fcec39776e nodeName:}" failed. No retries permitted until 2026-04-16 16:01:49.199388636 +0000 UTC m=+65.320046557 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs") pod "network-metrics-daemon-nwbqd" (UID: "dc41db5b-cc00-4f67-be36-85fcec39776e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:17.300148 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.300120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:17.305823 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.305764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkczl\" (UniqueName: \"kubernetes.io/projected/1c981ccc-1389-4ae2-9023-1e308ba5dbb1-kube-api-access-dkczl\") pod \"network-check-target-lc585\" (UID: \"1c981ccc-1389-4ae2-9023-1e308ba5dbb1\") " pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:17.312820 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.312624 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw"] Apr 16 16:01:17.315331 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.315311 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28"] Apr 16 16:01:17.316960 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:17.316933 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ab2514_228d_4b71_9807_690d11b7d9b8.slice/crio-d037b504d7b3fdf226f2d87b660515d0b7e9f94bb0028ede62c3fedfbdd79614 WatchSource:0}: Error finding container 
d037b504d7b3fdf226f2d87b660515d0b7e9f94bb0028ede62c3fedfbdd79614: Status 404 returned error can't find the container with id d037b504d7b3fdf226f2d87b660515d0b7e9f94bb0028ede62c3fedfbdd79614 Apr 16 16:01:17.318208 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:17.318179 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37a5c3f_7b9a_4dcd_8949_35c0027725f3.slice/crio-cac37a4e1f0d900d3a7108a620b32f28a2a0c4761ba5507190fab4d8f0108769 WatchSource:0}: Error finding container cac37a4e1f0d900d3a7108a620b32f28a2a0c4761ba5507190fab4d8f0108769: Status 404 returned error can't find the container with id cac37a4e1f0d900d3a7108a620b32f28a2a0c4761ba5507190fab4d8f0108769 Apr 16 16:01:17.341082 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.341020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd"] Apr 16 16:01:17.344003 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:17.343980 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52558a16_a499_4eb9_9149_84e08a5d0414.slice/crio-9c2cd3595365ccd1fdc707be79671daaf445977b8ef9b9737f725b5cf46cac6b WatchSource:0}: Error finding container 9c2cd3595365ccd1fdc707be79671daaf445977b8ef9b9737f725b5cf46cac6b: Status 404 returned error can't find the container with id 9c2cd3595365ccd1fdc707be79671daaf445977b8ef9b9737f725b5cf46cac6b Apr 16 16:01:17.353014 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.352988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds"] Apr 16 16:01:17.362195 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:17.362166 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c3fa31_c5ca_46a4_9301_8d7fd96e850e.slice/crio-65a2cb785a414969bd6a0d67522e4c0f571c5057b1937126f080ba3ffa11aae3 WatchSource:0}: Error finding container 65a2cb785a414969bd6a0d67522e4c0f571c5057b1937126f080ba3ffa11aae3: Status 404 returned error can't find the container with id 65a2cb785a414969bd6a0d67522e4c0f571c5057b1937126f080ba3ffa11aae3 Apr 16 16:01:17.376167 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.376140 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"] Apr 16 16:01:17.378761 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:17.378734 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43b389bd_40e6_4dcc_9713_4854f0caa438.slice/crio-59a1ba5f16c45fd4747dbe7a783cac671e907ca891a99cc1f0051ea383bdfd74 WatchSource:0}: Error finding container 59a1ba5f16c45fd4747dbe7a783cac671e907ca891a99cc1f0051ea383bdfd74: Status 404 returned error can't find the container with id 59a1ba5f16c45fd4747dbe7a783cac671e907ca891a99cc1f0051ea383bdfd74 Apr 16 16:01:17.401295 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.401271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:17.401539 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.401509 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:01:17.401539 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.401534 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found Apr 16 16:01:17.401679 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.401616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:18.401577191 +0000 UTC m=+34.522235110 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found Apr 16 16:01:17.502100 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.502065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:17.502228 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.502172 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:17.502290 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.502231 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:18.502216066 +0000 UTC m=+34.622873983 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:17.545489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.545442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:17.548714 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.548533 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:01:17.603363 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.603286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:17.603363 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.603361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:17.603575 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.603473 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:17.603575 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.603548 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:01:18.603528245 +0000 UTC m=+34.724186186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:17.603575 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.603475 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:17.603699 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:17.603639 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:18.603619212 +0000 UTC m=+34.724277131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:17.752885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.752851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" event={"ID":"98c3fa31-c5ca-46a4-9301-8d7fd96e850e","Type":"ContainerStarted","Data":"65a2cb785a414969bd6a0d67522e4c0f571c5057b1937126f080ba3ffa11aae3"} Apr 16 16:01:17.754076 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.754041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" event={"ID":"52558a16-a499-4eb9-9149-84e08a5d0414","Type":"ContainerStarted","Data":"9c2cd3595365ccd1fdc707be79671daaf445977b8ef9b9737f725b5cf46cac6b"} Apr 16 16:01:17.755263 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:01:17.755233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw" event={"ID":"50ab2514-228d-4b71-9807-690d11b7d9b8","Type":"ContainerStarted","Data":"d037b504d7b3fdf226f2d87b660515d0b7e9f94bb0028ede62c3fedfbdd79614"} Apr 16 16:01:17.756467 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.756429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" event={"ID":"b37a5c3f-7b9a-4dcd-8949-35c0027725f3","Type":"ContainerStarted","Data":"cac37a4e1f0d900d3a7108a620b32f28a2a0c4761ba5507190fab4d8f0108769"} Apr 16 16:01:17.757526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:17.757498 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerStarted","Data":"59a1ba5f16c45fd4747dbe7a783cac671e907ca891a99cc1f0051ea383bdfd74"} Apr 16 16:01:18.411508 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.411461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:18.411686 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.411580 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:01:18.411686 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.411607 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found Apr 16 16:01:18.411686 ip-10-0-140-78 
kubenswrapper[2577]: E0416 16:01:18.411666 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.411650376 +0000 UTC m=+36.532308292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found Apr 16 16:01:18.512717 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.512679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:18.512900 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.512843 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:18.512967 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.512924 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.512900262 +0000 UTC m=+36.633558183 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:18.545722 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.545681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:18.545878 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.545681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:18.548657 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.548624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:01:18.549916 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.549745 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vs75\"" Apr 16 16:01:18.549916 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.549753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x4n9w\"" Apr 16 16:01:18.561975 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.561953 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:18.613763 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.613729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:18.613914 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:18.613805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:18.613966 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.613915 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:18.614012 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.613990 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:18.614069 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.613999 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.6139778 +0000 UTC m=+36.734635720 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:18.614069 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:18.614058 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.614045119 +0000 UTC m=+36.734703039 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:19.798879 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:19.798615 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lc585"] Apr 16 16:01:19.801190 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:19.801160 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c981ccc_1389_4ae2_9023_1e308ba5dbb1.slice/crio-343849207da6851a2af6bca26458740aa1b6360d8d9d8e0bfbd4ef475cb228fd WatchSource:0}: Error finding container 343849207da6851a2af6bca26458740aa1b6360d8d9d8e0bfbd4ef475cb228fd: Status 404 returned error can't find the container with id 343849207da6851a2af6bca26458740aa1b6360d8d9d8e0bfbd4ef475cb228fd Apr 16 16:01:20.429308 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.428623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " 
pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:20.429308 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.428822 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:01:20.429308 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.428837 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found Apr 16 16:01:20.429308 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.428894 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:24.428875993 +0000 UTC m=+40.549533933 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found Apr 16 16:01:20.530199 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.529694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:20.530199 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.529835 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:20.530199 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.529906 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:24.529886784 +0000 UTC m=+40.650544706 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:20.630964 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.630930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:20.631167 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.631002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:20.631167 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.631100 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:20.631167 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.631119 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:20.631339 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.631178 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:24.63116245 +0000 UTC m=+40.751820368 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:20.631339 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:20.631194 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:24.631187037 +0000 UTC m=+40.751844956 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:20.732222 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.731555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:20.744934 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.744903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0af32a28-5811-4383-8faf-b6f21a0e2e4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwwrn\" (UID: \"0af32a28-5811-4383-8faf-b6f21a0e2e4e\") " pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 
16:01:20.772561 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.772528 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="e539ace1ae84f550fec10eafd4c79ff1bc5058dbd6f3d80ec94bcdc17279cf76" exitCode=0 Apr 16 16:01:20.772923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.772857 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"e539ace1ae84f550fec10eafd4c79ff1bc5058dbd6f3d80ec94bcdc17279cf76"} Apr 16 16:01:20.777833 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.777785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lc585" event={"ID":"1c981ccc-1389-4ae2-9023-1e308ba5dbb1","Type":"ContainerStarted","Data":"343849207da6851a2af6bca26458740aa1b6360d8d9d8e0bfbd4ef475cb228fd"} Apr 16 16:01:20.857308 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:20.856944 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwwrn" Apr 16 16:01:21.048259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:21.048196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hwwrn"] Apr 16 16:01:21.054721 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:21.054685 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af32a28_5811_4383_8faf_b6f21a0e2e4e.slice/crio-7c70de57bf3b170c0a628cafc324d0c2314af31b2cf799f22517232775c5b40a WatchSource:0}: Error finding container 7c70de57bf3b170c0a628cafc324d0c2314af31b2cf799f22517232775c5b40a: Status 404 returned error can't find the container with id 7c70de57bf3b170c0a628cafc324d0c2314af31b2cf799f22517232775c5b40a Apr 16 16:01:21.803647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:21.803607 2577 generic.go:358] "Generic (PLEG): container finished" podID="803316ee-89ca-4413-8b50-f8e2771aae54" containerID="c5e3c059751515fe5b0018cfed54ea2c723fe86a65284e7bf72f42b52474e4e2" exitCode=0 Apr 16 16:01:21.803805 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:21.803724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerDied","Data":"c5e3c059751515fe5b0018cfed54ea2c723fe86a65284e7bf72f42b52474e4e2"} Apr 16 16:01:21.812162 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:21.812097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hwwrn" event={"ID":"0af32a28-5811-4383-8faf-b6f21a0e2e4e","Type":"ContainerStarted","Data":"7c70de57bf3b170c0a628cafc324d0c2314af31b2cf799f22517232775c5b40a"} Apr 16 16:01:24.476712 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:24.476673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:24.477111 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.476854 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:01:24.477111 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.476874 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found Apr 16 16:01:24.477111 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.476930 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:32.476912853 +0000 UTC m=+48.597570770 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found Apr 16 16:01:24.577169 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:24.577134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:24.577321 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.577286 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:24.577381 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.577346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:32.577328655 +0000 UTC m=+48.697986586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:24.677733 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:24.677699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:24.677883 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:24.677788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:24.677883 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.677862 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:24.678001 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.677906 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:24.678001 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.677939 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:32.677907503 +0000 UTC m=+48.798565434 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:24.678001 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:24.677961 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:32.677950877 +0000 UTC m=+48.798608799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:31.840126 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.840071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lc585" event={"ID":"1c981ccc-1389-4ae2-9023-1e308ba5dbb1","Type":"ContainerStarted","Data":"27f299a6f1a6d859c7b43e90eb2e97b8327bd55b88c791a95ad52096c64236b6"} Apr 16 16:01:31.840901 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.840878 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:01:31.842123 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.842078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hwwrn" event={"ID":"0af32a28-5811-4383-8faf-b6f21a0e2e4e","Type":"ContainerStarted","Data":"f9325bcb423c1c5d638b90c6728237698ade56acff187d99c5594ba3b141e775"} Apr 16 16:01:31.845372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.845353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" event={"ID":"52558a16-a499-4eb9-9149-84e08a5d0414","Type":"ContainerStarted","Data":"a6707aeb5cba9d19993e2c8c7baa98e8993d691ea57a0f09dfa2f26d69ba3a26"} Apr 16 16:01:31.847033 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.847014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw" event={"ID":"50ab2514-228d-4b71-9807-690d11b7d9b8","Type":"ContainerStarted","Data":"98f8dd5fe983defbab87f7a92fe4673726a3f0ce78451392a85a44c7553cc15e"} Apr 16 16:01:31.848868 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.848818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" event={"ID":"b37a5c3f-7b9a-4dcd-8949-35c0027725f3","Type":"ContainerStarted","Data":"c71cdd7614266edc11d0847692d2224f9724b8a8664eae092b5dce276081f434"} Apr 16 16:01:31.851565 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.851525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerStarted","Data":"888894bc82681b95f8f3329aa0bd58a67070735dce67fefb86a0037c96587e6e"} Apr 16 16:01:31.855169 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.855127 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lc585" podStartSLOduration=36.14500461 podStartE2EDuration="47.855117874s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:01:19.803187259 +0000 UTC m=+35.923845179" lastFinishedPulling="2026-04-16 16:01:31.51330051 +0000 UTC m=+47.633958443" observedRunningTime="2026-04-16 16:01:31.854988867 +0000 UTC m=+47.975646806" watchObservedRunningTime="2026-04-16 16:01:31.855117874 +0000 UTC m=+47.975775814" Apr 
16 16:01:31.855275 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.855217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" event={"ID":"803316ee-89ca-4413-8b50-f8e2771aae54","Type":"ContainerStarted","Data":"cd6dc861d8b0bfba6412f8748e64ab0ab4e1e1e2396b2c647f9c88e3ff9d0979"} Apr 16 16:01:31.856959 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.856936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" event={"ID":"98c3fa31-c5ca-46a4-9301-8d7fd96e850e","Type":"ContainerStarted","Data":"2cbe8f5ea9d508247a26b4eb1cdd6ccc3515351d277b279f9d93c463c9847b66"} Apr 16 16:01:31.857257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.857216 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:31.858869 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.858851 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" Apr 16 16:01:31.869955 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.869913 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" podStartSLOduration=32.783064709 podStartE2EDuration="46.869899353s" podCreationTimestamp="2026-04-16 16:00:45 +0000 UTC" firstStartedPulling="2026-04-16 16:01:17.321597452 +0000 UTC m=+33.442255375" lastFinishedPulling="2026-04-16 16:01:31.408432098 +0000 UTC m=+47.529090019" observedRunningTime="2026-04-16 16:01:31.86867926 +0000 UTC m=+47.989337209" watchObservedRunningTime="2026-04-16 16:01:31.869899353 +0000 UTC m=+47.990557292" Apr 16 16:01:31.883169 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.883114 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8hbqw" podStartSLOduration=31.681821892 podStartE2EDuration="45.883097571s" podCreationTimestamp="2026-04-16 16:00:46 +0000 UTC" firstStartedPulling="2026-04-16 16:01:17.321702986 +0000 UTC m=+33.442360905" lastFinishedPulling="2026-04-16 16:01:31.522978668 +0000 UTC m=+47.643636584" observedRunningTime="2026-04-16 16:01:31.881900158 +0000 UTC m=+48.002558097" watchObservedRunningTime="2026-04-16 16:01:31.883097571 +0000 UTC m=+48.003755510" Apr 16 16:01:31.904813 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.904766 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78bb4f7fc5-bkswd" podStartSLOduration=28.779067684 podStartE2EDuration="42.904750836s" podCreationTimestamp="2026-04-16 16:00:49 +0000 UTC" firstStartedPulling="2026-04-16 16:01:17.346045198 +0000 UTC m=+33.466703118" lastFinishedPulling="2026-04-16 16:01:31.471728348 +0000 UTC m=+47.592386270" observedRunningTime="2026-04-16 16:01:31.904344214 +0000 UTC m=+48.025002151" watchObservedRunningTime="2026-04-16 16:01:31.904750836 +0000 UTC m=+48.025408778" Apr 16 16:01:31.919851 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.919796 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hwwrn" podStartSLOduration=9.465389433 podStartE2EDuration="19.919780589s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:21.0593616 +0000 UTC m=+37.180019532" lastFinishedPulling="2026-04-16 16:01:31.513752767 +0000 UTC m=+47.634410688" observedRunningTime="2026-04-16 16:01:31.918186538 +0000 UTC m=+48.038844478" watchObservedRunningTime="2026-04-16 16:01:31.919780589 +0000 UTC m=+48.040438530" Apr 16 16:01:31.934195 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.934158 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bb999597c-26jds" podStartSLOduration=28.784639966 podStartE2EDuration="42.934146354s" podCreationTimestamp="2026-04-16 16:00:49 +0000 UTC" firstStartedPulling="2026-04-16 16:01:17.364154174 +0000 UTC m=+33.484812097" lastFinishedPulling="2026-04-16 16:01:31.513660554 +0000 UTC m=+47.634318485" observedRunningTime="2026-04-16 16:01:31.932794254 +0000 UTC m=+48.053452193" watchObservedRunningTime="2026-04-16 16:01:31.934146354 +0000 UTC m=+48.054804293" Apr 16 16:01:31.956220 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:31.956134 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qmfsc" podStartSLOduration=13.621225328 podStartE2EDuration="47.956122437s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:00:45.796236637 +0000 UTC m=+1.916894556" lastFinishedPulling="2026-04-16 16:01:20.131133748 +0000 UTC m=+36.251791665" observedRunningTime="2026-04-16 16:01:31.95574602 +0000 UTC m=+48.076403962" watchObservedRunningTime="2026-04-16 16:01:31.956122437 +0000 UTC m=+48.076780367" Apr 16 16:01:32.551883 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:32.551852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:32.552090 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.551967 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:01:32.552090 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.551977 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-b5f59564f-rfdx5: secret "image-registry-tls" not found Apr 16 16:01:32.552090 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.552022 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls podName:7d391ccc-e374-4094-94b3-f339069ebc92 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:48.552009063 +0000 UTC m=+64.672666980 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls") pod "image-registry-b5f59564f-rfdx5" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92") : secret "image-registry-tls" not found Apr 16 16:01:32.652546 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:32.652507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:32.652749 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.652726 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:01:32.652868 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.652856 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert podName:643d0a94-27da-40d4-a8e6-22ee5f06edb9 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:48.652784648 +0000 UTC m=+64.773442564 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6xbxt" (UID: "643d0a94-27da-40d4-a8e6-22ee5f06edb9") : secret "networking-console-plugin-cert" not found Apr 16 16:01:32.753030 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:32.752998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:32.753190 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:32.753080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:32.753190 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.753152 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:01:32.753305 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.753212 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls podName:600ee1fa-9fa9-462c-88e4-75f0cb67d3e7 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:48.753196428 +0000 UTC m=+64.873854345 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls") pod "dns-default-w97np" (UID: "600ee1fa-9fa9-462c-88e4-75f0cb67d3e7") : secret "dns-default-metrics-tls" not found Apr 16 16:01:32.753305 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.753225 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:01:32.753305 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:01:32.753276 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert podName:518655bd-6e85-4b6c-9e7d-11b4b7bc3edd nodeName:}" failed. No retries permitted until 2026-04-16 16:01:48.753259584 +0000 UTC m=+64.873917505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert") pod "ingress-canary-gt8zg" (UID: "518655bd-6e85-4b6c-9e7d-11b4b7bc3edd") : secret "canary-serving-cert" not found Apr 16 16:01:34.868948 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:34.868921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerStarted","Data":"43c9dfee800545ae73ba5b7d1c41d9fa9486bcd8de20fd12dbba51a645e82528"} Apr 16 16:01:35.873578 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:35.873538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerStarted","Data":"f0588545c4db0238de41e5e638a4b466bc7c75e8725fd55de6db2b23f833d277"} Apr 16 16:01:35.893963 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:35.893915 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" podStartSLOduration=29.54410119 podStartE2EDuration="46.893902252s" podCreationTimestamp="2026-04-16 16:00:49 +0000 UTC" firstStartedPulling="2026-04-16 16:01:17.380836742 +0000 UTC m=+33.501494658" lastFinishedPulling="2026-04-16 16:01:34.730637801 +0000 UTC m=+50.851295720" observedRunningTime="2026-04-16 16:01:35.892989935 +0000 UTC m=+52.013647895" watchObservedRunningTime="2026-04-16 16:01:35.893902252 +0000 UTC m=+52.014560191" Apr 16 16:01:36.005505 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.005471 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-hr8kt"] Apr 16 16:01:36.022623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.022596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-hr8kt"] Apr 16 16:01:36.022780 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.022710 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.025551 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.025525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 16:01:36.025741 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.025597 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 16:01:36.025863 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.025747 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 16:01:36.025863 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.025620 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pr7km\"" Apr 16 16:01:36.025863 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.025621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 16:01:36.079412 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.079385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k224s\" (UniqueName: \"kubernetes.io/projected/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-kube-api-access-k224s\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.079587 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.079472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-cabundle\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " 
pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.079587 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.079535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-key\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.180017 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.179949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k224s\" (UniqueName: \"kubernetes.io/projected/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-kube-api-access-k224s\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.180134 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.180021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-cabundle\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.180134 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.180053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-key\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.180798 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.180780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-cabundle\") 
pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.183003 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.182980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-signing-key\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.189051 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.189031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k224s\" (UniqueName: \"kubernetes.io/projected/ca3df19b-74b5-4a74-92b5-4e9edb36ecf5-kube-api-access-k224s\") pod \"service-ca-bfc587fb7-hr8kt\" (UID: \"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5\") " pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.332091 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.332063 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" Apr 16 16:01:36.452997 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.452912 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-hr8kt"] Apr 16 16:01:36.455707 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:36.455673 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3df19b_74b5_4a74_92b5_4e9edb36ecf5.slice/crio-8b0ed2d2120346466c449b391e347f0b9277e895cf4072affc94d4f7467d905c WatchSource:0}: Error finding container 8b0ed2d2120346466c449b391e347f0b9277e895cf4072affc94d4f7467d905c: Status 404 returned error can't find the container with id 8b0ed2d2120346466c449b391e347f0b9277e895cf4072affc94d4f7467d905c Apr 16 16:01:36.877622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.877587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" event={"ID":"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5","Type":"ContainerStarted","Data":"871d48caf8ce7b6ee4fb845d38429aa301a51c80feddf29a446dbdb63b36bd53"} Apr 16 16:01:36.877622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.877623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" event={"ID":"ca3df19b-74b5-4a74-92b5-4e9edb36ecf5","Type":"ContainerStarted","Data":"8b0ed2d2120346466c449b391e347f0b9277e895cf4072affc94d4f7467d905c"} Apr 16 16:01:36.897120 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:36.897075 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-hr8kt" podStartSLOduration=1.8970598220000001 podStartE2EDuration="1.897059822s" podCreationTimestamp="2026-04-16 16:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
16:01:36.896369282 +0000 UTC m=+53.017027220" watchObservedRunningTime="2026-04-16 16:01:36.897059822 +0000 UTC m=+53.017717805" Apr 16 16:01:48.578732 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.578701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:48.581121 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.581097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"image-registry-b5f59564f-rfdx5\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") " pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:48.679849 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.679820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:48.682163 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.682136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/643d0a94-27da-40d4-a8e6-22ee5f06edb9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6xbxt\" (UID: \"643d0a94-27da-40d4-a8e6-22ee5f06edb9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:48.784856 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.781660 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:48.784856 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.781793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:48.785087 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.784906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/600ee1fa-9fa9-462c-88e4-75f0cb67d3e7-metrics-tls\") pod \"dns-default-w97np\" (UID: \"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7\") " pod="openshift-dns/dns-default-w97np" Apr 16 16:01:48.787425 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.787402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518655bd-6e85-4b6c-9e7d-11b4b7bc3edd-cert\") pod \"ingress-canary-gt8zg\" (UID: \"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd\") " pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:48.853237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.853155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8px99\"" Apr 16 16:01:48.860896 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.860873 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:48.862345 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.862329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5pfpn\"" Apr 16 16:01:48.870596 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.870572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" Apr 16 16:01:48.984259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.984228 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5q2sn\"" Apr 16 16:01:48.992069 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.992046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gt8zg" Apr 16 16:01:48.992390 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:48.992365 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"] Apr 16 16:01:48.997073 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:48.997047 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d391ccc_e374_4094_94b3_f339069ebc92.slice/crio-14b9396264f2c0b0e451ab6311771308eacd54f1a904c22bb79b3d3de6ac9f22 WatchSource:0}: Error finding container 14b9396264f2c0b0e451ab6311771308eacd54f1a904c22bb79b3d3de6ac9f22: Status 404 returned error can't find the container with id 14b9396264f2c0b0e451ab6311771308eacd54f1a904c22bb79b3d3de6ac9f22 Apr 16 16:01:49.000488 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.000461 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jr2th\"" Apr 16 16:01:49.003999 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:01:49.003978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt"] Apr 16 16:01:49.006573 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:49.006553 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643d0a94_27da_40d4_a8e6_22ee5f06edb9.slice/crio-726dabb61cd0f836d77d187548ea082df2ef0be9a5192d484635729ad0592d80 WatchSource:0}: Error finding container 726dabb61cd0f836d77d187548ea082df2ef0be9a5192d484635729ad0592d80: Status 404 returned error can't find the container with id 726dabb61cd0f836d77d187548ea082df2ef0be9a5192d484635729ad0592d80 Apr 16 16:01:49.008142 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.008122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w97np" Apr 16 16:01:49.123736 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.123691 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gt8zg"] Apr 16 16:01:49.125621 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:49.125589 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518655bd_6e85_4b6c_9e7d_11b4b7bc3edd.slice/crio-8d05dfacd9ce14fdf7d2f7a575421419677e9bddf0e619e220f3a86a8f62215f WatchSource:0}: Error finding container 8d05dfacd9ce14fdf7d2f7a575421419677e9bddf0e619e220f3a86a8f62215f: Status 404 returned error can't find the container with id 8d05dfacd9ce14fdf7d2f7a575421419677e9bddf0e619e220f3a86a8f62215f Apr 16 16:01:49.140226 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.140203 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w97np"] Apr 16 16:01:49.143300 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:49.143275 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod600ee1fa_9fa9_462c_88e4_75f0cb67d3e7.slice/crio-85637f77e61cb7942db7e792b55220001c42466dc0030f5db54e8b4c790a1151 WatchSource:0}: Error finding container 85637f77e61cb7942db7e792b55220001c42466dc0030f5db54e8b4c790a1151: Status 404 returned error can't find the container with id 85637f77e61cb7942db7e792b55220001c42466dc0030f5db54e8b4c790a1151 Apr 16 16:01:49.284842 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.284793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:49.287829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.287803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:01:49.297425 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.297403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc41db5b-cc00-4f67-be36-85fcec39776e-metrics-certs\") pod \"network-metrics-daemon-nwbqd\" (UID: \"dc41db5b-cc00-4f67-be36-85fcec39776e\") " pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:49.470396 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.470324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x4n9w\"" Apr 16 16:01:49.478376 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.478348 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwbqd" Apr 16 16:01:49.607839 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.607804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nwbqd"] Apr 16 16:01:49.615039 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:49.615004 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc41db5b_cc00_4f67_be36_85fcec39776e.slice/crio-0884b52e16a652ddd8480638f1c4b54c6bc929beabf96cb27082a876053a6d2c WatchSource:0}: Error finding container 0884b52e16a652ddd8480638f1c4b54c6bc929beabf96cb27082a876053a6d2c: Status 404 returned error can't find the container with id 0884b52e16a652ddd8480638f1c4b54c6bc929beabf96cb27082a876053a6d2c Apr 16 16:01:49.912753 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.912709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" event={"ID":"7d391ccc-e374-4094-94b3-f339069ebc92","Type":"ContainerStarted","Data":"c84279afb1f318049819218fb8c2db7b85b5e9669591db2a1500637718737019"} Apr 16 16:01:49.912930 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.912763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" event={"ID":"7d391ccc-e374-4094-94b3-f339069ebc92","Type":"ContainerStarted","Data":"14b9396264f2c0b0e451ab6311771308eacd54f1a904c22bb79b3d3de6ac9f22"} Apr 16 16:01:49.913009 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.912975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:01:49.914587 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.914554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwbqd" 
event={"ID":"dc41db5b-cc00-4f67-be36-85fcec39776e","Type":"ContainerStarted","Data":"0884b52e16a652ddd8480638f1c4b54c6bc929beabf96cb27082a876053a6d2c"} Apr 16 16:01:49.915664 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.915639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gt8zg" event={"ID":"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd","Type":"ContainerStarted","Data":"8d05dfacd9ce14fdf7d2f7a575421419677e9bddf0e619e220f3a86a8f62215f"} Apr 16 16:01:49.916765 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.916734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" event={"ID":"643d0a94-27da-40d4-a8e6-22ee5f06edb9","Type":"ContainerStarted","Data":"726dabb61cd0f836d77d187548ea082df2ef0be9a5192d484635729ad0592d80"} Apr 16 16:01:49.918570 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.918541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w97np" event={"ID":"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7","Type":"ContainerStarted","Data":"85637f77e61cb7942db7e792b55220001c42466dc0030f5db54e8b4c790a1151"} Apr 16 16:01:49.936489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:49.935600 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" podStartSLOduration=64.935589119 podStartE2EDuration="1m4.935589119s" podCreationTimestamp="2026-04-16 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:49.934110032 +0000 UTC m=+66.054767969" watchObservedRunningTime="2026-04-16 16:01:49.935589119 +0000 UTC m=+66.056247057" Apr 16 16:01:52.929568 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.929523 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwbqd" 
event={"ID":"dc41db5b-cc00-4f67-be36-85fcec39776e","Type":"ContainerStarted","Data":"463801663e5063e75d6709755f54e9ce7c883fd9b7b2f765218abb8415822171"} Apr 16 16:01:52.929568 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.929568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwbqd" event={"ID":"dc41db5b-cc00-4f67-be36-85fcec39776e","Type":"ContainerStarted","Data":"33180bdd51dd23d7d5acc1ab7a4d8f425f5b7081dc81ee0b883f70c3cebd9f2f"} Apr 16 16:01:52.930908 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.930883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gt8zg" event={"ID":"518655bd-6e85-4b6c-9e7d-11b4b7bc3edd","Type":"ContainerStarted","Data":"a9254f3def4a2cb827e862507cfc9407d9e72345ffd28be4ad1444a15f2d08f5"} Apr 16 16:01:52.932169 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.932150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" event={"ID":"643d0a94-27da-40d4-a8e6-22ee5f06edb9","Type":"ContainerStarted","Data":"04a243fc7f983172b063faad21959cb616c6453926cbf678c223c9142996c7a7"} Apr 16 16:01:52.933615 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.933588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w97np" event={"ID":"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7","Type":"ContainerStarted","Data":"6a6ab6e0aded93023b9ee8f1c49056175ea9f8f27f2da89b5e570c976c2eba20"} Apr 16 16:01:52.933615 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.933610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w97np" event={"ID":"600ee1fa-9fa9-462c-88e4-75f0cb67d3e7","Type":"ContainerStarted","Data":"9f3fc630f7bd9a3dcaf684ecbf63223b837b051bfaf06c4c856907a49af4c722"} Apr 16 16:01:52.933789 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.933767 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-w97np" Apr 16 16:01:52.945486 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.945293 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nwbqd" podStartSLOduration=66.384329378 podStartE2EDuration="1m8.945282209s" podCreationTimestamp="2026-04-16 16:00:44 +0000 UTC" firstStartedPulling="2026-04-16 16:01:49.61767849 +0000 UTC m=+65.738336422" lastFinishedPulling="2026-04-16 16:01:52.178631325 +0000 UTC m=+68.299289253" observedRunningTime="2026-04-16 16:01:52.94491574 +0000 UTC m=+69.065573681" watchObservedRunningTime="2026-04-16 16:01:52.945282209 +0000 UTC m=+69.065940148" Apr 16 16:01:52.960249 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.960209 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w97np" podStartSLOduration=34.434417626 podStartE2EDuration="36.960199084s" podCreationTimestamp="2026-04-16 16:01:16 +0000 UTC" firstStartedPulling="2026-04-16 16:01:49.14511572 +0000 UTC m=+65.265773637" lastFinishedPulling="2026-04-16 16:01:51.670897165 +0000 UTC m=+67.791555095" observedRunningTime="2026-04-16 16:01:52.959632925 +0000 UTC m=+69.080290864" watchObservedRunningTime="2026-04-16 16:01:52.960199084 +0000 UTC m=+69.080857023" Apr 16 16:01:52.974362 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.974319 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6xbxt" podStartSLOduration=37.312168825 podStartE2EDuration="39.974309044s" podCreationTimestamp="2026-04-16 16:01:13 +0000 UTC" firstStartedPulling="2026-04-16 16:01:49.008762733 +0000 UTC m=+65.129420655" lastFinishedPulling="2026-04-16 16:01:51.670902954 +0000 UTC m=+67.791560874" observedRunningTime="2026-04-16 16:01:52.973572827 +0000 UTC m=+69.094230766" watchObservedRunningTime="2026-04-16 16:01:52.974309044 +0000 UTC m=+69.094966983" Apr 16 
16:01:52.993046 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:52.993004 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gt8zg" podStartSLOduration=34.445074409 podStartE2EDuration="36.992991626s" podCreationTimestamp="2026-04-16 16:01:16 +0000 UTC" firstStartedPulling="2026-04-16 16:01:49.127658763 +0000 UTC m=+65.248316682" lastFinishedPulling="2026-04-16 16:01:51.675575978 +0000 UTC m=+67.796233899" observedRunningTime="2026-04-16 16:01:52.992231983 +0000 UTC m=+69.112889922" watchObservedRunningTime="2026-04-16 16:01:52.992991626 +0000 UTC m=+69.113649586" Apr 16 16:01:57.086720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.086682 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6hkg6"] Apr 16 16:01:57.120098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.120067 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6hkg6"] Apr 16 16:01:57.120098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.120093 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.123125 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.123099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 16:01:57.124276 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.124247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-d7jg4\"" Apr 16 16:01:57.124395 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.124292 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 16:01:57.124395 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.124249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.124617 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.124599 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.143606 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.143510 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 16:01:57.222536 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.222501 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2"] Apr 16 16:01:57.241622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.241594 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.244022 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.243995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4e2a6f2-a469-4d6a-92f6-19e95df574de-serving-cert\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.244131 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.244034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7grh\" (UniqueName: \"kubernetes.io/projected/c4e2a6f2-a469-4d6a-92f6-19e95df574de-kube-api-access-p7grh\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.244131 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.244073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-config\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.244131 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.244111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-trusted-ca\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.245153 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.245139 2577 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:01:57.245347 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.245334 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.245974 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.245960 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.249697 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.249679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sb8tz\"" Apr 16 16:01:57.254742 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.254723 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2"] Apr 16 16:01:57.266404 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.266378 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t2wfh"] Apr 16 16:01:57.286472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.286429 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d"] Apr 16 16:01:57.286583 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.286564 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.289168 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.289143 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.289700 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.289679 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.290083 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.290060 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pdp9s\"" Apr 16 16:01:57.290162 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.290147 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:01:57.290379 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.290346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:01:57.294353 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.294332 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:01:57.307114 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.307096 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-76688fbdc6-w7hxk"] Apr 16 16:01:57.307255 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.307240 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" Apr 16 16:01:57.315022 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.314983 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6j465\"" Apr 16 16:01:57.315140 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.315062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.315320 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.315226 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.326897 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.326878 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t2wfh"] Apr 16 16:01:57.326997 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.326911 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76688fbdc6-w7hxk"] Apr 16 16:01:57.326997 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.326926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d"] Apr 16 16:01:57.326997 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.326928 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.326997 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.326945 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf"] Apr 16 16:01:57.329820 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.329802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.329958 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.329942 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kx6jl\"" Apr 16 16:01:57.330096 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.330081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 16:01:57.330302 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.330288 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 16:01:57.330625 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.330609 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.330694 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.330667 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 16:01:57.330804 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.330781 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 16:01:57.345472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c4e2a6f2-a469-4d6a-92f6-19e95df574de-serving-cert\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.345472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7grh\" (UniqueName: \"kubernetes.io/projected/c4e2a6f2-a469-4d6a-92f6-19e95df574de-kube-api-access-p7grh\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.345641 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-snapshots\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.345689 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19275ccb-4724-4f17-8bca-e669bfafe552-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: \"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.345759 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtz48\" (UniqueName: \"kubernetes.io/projected/19275ccb-4724-4f17-8bca-e669bfafe552-kube-api-access-wtz48\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: 
\"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.345822 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-config\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.345875 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-trusted-ca\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.345875 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-tmp\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.345979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c970dd-50f2-41a4-824e-2a03a125acfd-serving-cert\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.345979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.345979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.345979 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.345970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc64g\" (UniqueName: \"kubernetes.io/projected/c8c970dd-50f2-41a4-824e-2a03a125acfd-kube-api-access-hc64g\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.346618 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.346601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-trusted-ca\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.346693 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.346674 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e2a6f2-a469-4d6a-92f6-19e95df574de-config\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.346809 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.346796 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc"] Apr 16 16:01:57.346962 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.346947 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.348268 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.348237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4e2a6f2-a469-4d6a-92f6-19e95df574de-serving-cert\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.350080 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.350066 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5sz8s\"" Apr 16 16:01:57.350264 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.350250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.350396 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.350315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:01:57.350396 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.350343 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:01:57.350517 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:01:57.350428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.367333 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.367316 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf"] Apr 16 16:01:57.367411 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.367337 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc"] Apr 16 16:01:57.367469 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.367411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.375161 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.375141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7grh\" (UniqueName: \"kubernetes.io/projected/c4e2a6f2-a469-4d6a-92f6-19e95df574de-kube-api-access-p7grh\") pod \"console-operator-d87b8d5fc-6hkg6\" (UID: \"c4e2a6f2-a469-4d6a-92f6-19e95df574de\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.375282 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.375265 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:01:57.375390 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.375374 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:01:57.375492 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.375416 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:01:57.375572 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:01:57.375516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:01:57.375572 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.375524 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f7gfb\"" Apr 16 16:01:57.429982 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.429960 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:01:57.447219 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-tmp\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447326 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c970dd-50f2-41a4-824e-2a03a125acfd-serving-cert\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447326 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-default-certificate\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.447438 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447372 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d4ab93-1d05-421c-a438-24378cc5b1b4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.447438 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5px\" (UniqueName: \"kubernetes.io/projected/16874dd3-186f-4feb-bc62-e0eae6ce6072-kube-api-access-kh5px\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.447569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jdr\" (UniqueName: 
\"kubernetes.io/projected/5f55b121-017a-43b5-97b3-540ef3a398d6-kube-api-access-d4jdr\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.447713 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc64g\" (UniqueName: \"kubernetes.io/projected/c8c970dd-50f2-41a4-824e-2a03a125acfd-kube-api-access-hc64g\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447713 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfqsp\" (UniqueName: \"kubernetes.io/projected/ad8ff5ed-0304-45fd-87fb-809604b2cc69-kube-api-access-kfqsp\") pod \"volume-data-source-validator-7d955d5dd4-bxn9d\" (UID: \"ad8ff5ed-0304-45fd-87fb-809604b2cc69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" Apr 16 16:01:57.447713 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16874dd3-186f-4feb-bc62-e0eae6ce6072-service-ca-bundle\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.447713 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-tmp\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5f55b121-017a-43b5-97b3-540ef3a398d6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-metrics-certs\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-stats-auth\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-snapshots\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f55b121-017a-43b5-97b3-540ef3a398d6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.447885 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19275ccb-4724-4f17-8bca-e669bfafe552-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: \"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.448191 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wd8d\" (UniqueName: \"kubernetes.io/projected/30d4ab93-1d05-421c-a438-24378cc5b1b4-kube-api-access-2wd8d\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.448191 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtz48\" (UniqueName: \"kubernetes.io/projected/19275ccb-4724-4f17-8bca-e669bfafe552-kube-api-access-wtz48\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: \"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.448191 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.447968 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30d4ab93-1d05-421c-a438-24378cc5b1b4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.448400 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.448382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c8c970dd-50f2-41a4-824e-2a03a125acfd-snapshots\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.448437 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.448397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.448549 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.448530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c970dd-50f2-41a4-824e-2a03a125acfd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.450008 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.449983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c970dd-50f2-41a4-824e-2a03a125acfd-serving-cert\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.451406 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.451386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19275ccb-4724-4f17-8bca-e669bfafe552-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: \"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.461285 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.461266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc64g\" (UniqueName: \"kubernetes.io/projected/c8c970dd-50f2-41a4-824e-2a03a125acfd-kube-api-access-hc64g\") pod \"insights-operator-5785d4fcdd-t2wfh\" (UID: \"c8c970dd-50f2-41a4-824e-2a03a125acfd\") " pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.466022 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.466002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtz48\" (UniqueName: \"kubernetes.io/projected/19275ccb-4724-4f17-8bca-e669bfafe552-kube-api-access-wtz48\") pod \"cluster-samples-operator-667775844f-wcrv2\" (UID: \"19275ccb-4724-4f17-8bca-e669bfafe552\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.549315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-default-certificate\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.549315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/30d4ab93-1d05-421c-a438-24378cc5b1b4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.549547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5px\" (UniqueName: \"kubernetes.io/projected/16874dd3-186f-4feb-bc62-e0eae6ce6072-kube-api-access-kh5px\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.549547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jdr\" (UniqueName: \"kubernetes.io/projected/5f55b121-017a-43b5-97b3-540ef3a398d6-kube-api-access-d4jdr\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.549547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfqsp\" (UniqueName: \"kubernetes.io/projected/ad8ff5ed-0304-45fd-87fb-809604b2cc69-kube-api-access-kfqsp\") pod \"volume-data-source-validator-7d955d5dd4-bxn9d\" (UID: \"ad8ff5ed-0304-45fd-87fb-809604b2cc69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" Apr 16 16:01:57.549547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16874dd3-186f-4feb-bc62-e0eae6ce6072-service-ca-bundle\") pod 
\"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.549547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5f55b121-017a-43b5-97b3-540ef3a398d6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.549811 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-metrics-certs\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.549811 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-stats-auth\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.549811 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f55b121-017a-43b5-97b3-540ef3a398d6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.549811 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549659 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wd8d\" (UniqueName: \"kubernetes.io/projected/30d4ab93-1d05-421c-a438-24378cc5b1b4-kube-api-access-2wd8d\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.549811 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.549704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d4ab93-1d05-421c-a438-24378cc5b1b4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.550693 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.550202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16874dd3-186f-4feb-bc62-e0eae6ce6072-service-ca-bundle\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.550693 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.550260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d4ab93-1d05-421c-a438-24378cc5b1b4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.550693 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.550294 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" Apr 16 16:01:57.550693 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.550514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5f55b121-017a-43b5-97b3-540ef3a398d6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.552508 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.552489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-metrics-certs\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.552622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.552512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d4ab93-1d05-421c-a438-24378cc5b1b4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.552622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.552594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-default-certificate\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.552844 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.552821 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f55b121-017a-43b5-97b3-540ef3a398d6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.553018 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.553000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16874dd3-186f-4feb-bc62-e0eae6ce6072-stats-auth\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.557400 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.557287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6hkg6"] Apr 16 16:01:57.560066 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:57.560044 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e2a6f2_a469_4d6a_92f6_19e95df574de.slice/crio-7a93a4dc38347e8628f76e0b5c4d6e72acef087c42ddf9ea91ab87f870b6bbe8 WatchSource:0}: Error finding container 7a93a4dc38347e8628f76e0b5c4d6e72acef087c42ddf9ea91ab87f870b6bbe8: Status 404 returned error can't find the container with id 7a93a4dc38347e8628f76e0b5c4d6e72acef087c42ddf9ea91ab87f870b6bbe8 Apr 16 16:01:57.564334 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.564312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wd8d\" (UniqueName: \"kubernetes.io/projected/30d4ab93-1d05-421c-a438-24378cc5b1b4-kube-api-access-2wd8d\") pod \"kube-storage-version-migrator-operator-756bb7d76f-5szhf\" (UID: \"30d4ab93-1d05-421c-a438-24378cc5b1b4\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.564528 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.564499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5px\" (UniqueName: \"kubernetes.io/projected/16874dd3-186f-4feb-bc62-e0eae6ce6072-kube-api-access-kh5px\") pod \"router-default-76688fbdc6-w7hxk\" (UID: \"16874dd3-186f-4feb-bc62-e0eae6ce6072\") " pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.565005 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.564981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfqsp\" (UniqueName: \"kubernetes.io/projected/ad8ff5ed-0304-45fd-87fb-809604b2cc69-kube-api-access-kfqsp\") pod \"volume-data-source-validator-7d955d5dd4-bxn9d\" (UID: \"ad8ff5ed-0304-45fd-87fb-809604b2cc69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" Apr 16 16:01:57.565109 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.565048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jdr\" (UniqueName: \"kubernetes.io/projected/5f55b121-017a-43b5-97b3-540ef3a398d6-kube-api-access-d4jdr\") pod \"cluster-monitoring-operator-6667474d89-6dhvc\" (UID: \"5f55b121-017a-43b5-97b3-540ef3a398d6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.597680 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.597594 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" Apr 16 16:01:57.617890 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.617849 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" Apr 16 16:01:57.635932 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.635828 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:57.660868 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.660830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" Apr 16 16:01:57.675816 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.675627 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" Apr 16 16:01:57.697991 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.692253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2"] Apr 16 16:01:57.776119 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.775870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t2wfh"] Apr 16 16:01:57.795567 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.795535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d"] Apr 16 16:01:57.836489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.836388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76688fbdc6-w7hxk"] Apr 16 16:01:57.839070 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:57.839035 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16874dd3_186f_4feb_bc62_e0eae6ce6072.slice/crio-9e2f68dff9ce27d95a7b232691c94b868fb3ab2bdfd72481b588db34b5120c09 WatchSource:0}: 
Error finding container 9e2f68dff9ce27d95a7b232691c94b868fb3ab2bdfd72481b588db34b5120c09: Status 404 returned error can't find the container with id 9e2f68dff9ce27d95a7b232691c94b868fb3ab2bdfd72481b588db34b5120c09 Apr 16 16:01:57.866503 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.866481 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf"] Apr 16 16:01:57.868830 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:01:57.868804 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30d4ab93_1d05_421c_a438_24378cc5b1b4.slice/crio-fd3c2dc9a7096e2f34cc05f01c9175600dd78559ca7deb0b0985b23b02978b20 WatchSource:0}: Error finding container fd3c2dc9a7096e2f34cc05f01c9175600dd78559ca7deb0b0985b23b02978b20: Status 404 returned error can't find the container with id fd3c2dc9a7096e2f34cc05f01c9175600dd78559ca7deb0b0985b23b02978b20 Apr 16 16:01:57.948706 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.948680 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" event={"ID":"c4e2a6f2-a469-4d6a-92f6-19e95df574de","Type":"ContainerStarted","Data":"7a93a4dc38347e8628f76e0b5c4d6e72acef087c42ddf9ea91ab87f870b6bbe8"} Apr 16 16:01:57.949622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.949589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" event={"ID":"16874dd3-186f-4feb-bc62-e0eae6ce6072","Type":"ContainerStarted","Data":"9e2f68dff9ce27d95a7b232691c94b868fb3ab2bdfd72481b588db34b5120c09"} Apr 16 16:01:57.950495 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.950463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" 
event={"ID":"30d4ab93-1d05-421c-a438-24378cc5b1b4","Type":"ContainerStarted","Data":"fd3c2dc9a7096e2f34cc05f01c9175600dd78559ca7deb0b0985b23b02978b20"} Apr 16 16:01:57.951601 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.951580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" event={"ID":"ad8ff5ed-0304-45fd-87fb-809604b2cc69","Type":"ContainerStarted","Data":"b61a2a81acb14db6fabe419a04b01018591cf20dd0e0dd2638bd45c65a2e5c9d"} Apr 16 16:01:57.952827 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.952807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" event={"ID":"c8c970dd-50f2-41a4-824e-2a03a125acfd","Type":"ContainerStarted","Data":"e407cbe0ad3475bfc0ff58ca5cf5c54cb2fa601e0b9f15c50a340e37bc6e21d3"} Apr 16 16:01:57.953808 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:57.953789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" event={"ID":"19275ccb-4724-4f17-8bca-e669bfafe552","Type":"ContainerStarted","Data":"5db577f6d6bdc9b6493c40c93143fbd34d4517d1572a5269ce3e2b82e38ae6ce"} Apr 16 16:01:58.051692 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:58.051668 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc"] Apr 16 16:01:58.963567 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:58.962602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" event={"ID":"16874dd3-186f-4feb-bc62-e0eae6ce6072","Type":"ContainerStarted","Data":"1d3239610b8c7d87c6196faf439e69c0ad985d7147d1bc69472b62e778884b1f"} Apr 16 16:01:58.966041 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:58.965969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" event={"ID":"5f55b121-017a-43b5-97b3-540ef3a398d6","Type":"ContainerStarted","Data":"b943e2af2c0884380bffce04c4d3075809762e18c18e68664c52883dd7c2e72d"} Apr 16 16:01:58.981812 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:58.981756 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" podStartSLOduration=1.981737098 podStartE2EDuration="1.981737098s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:58.980834556 +0000 UTC m=+75.101492494" watchObservedRunningTime="2026-04-16 16:01:58.981737098 +0000 UTC m=+75.102395039" Apr 16 16:01:59.636814 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:59.636775 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:59.640211 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:59.639953 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:59.970237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:59.969791 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:01:59.971266 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:01:59.971242 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-76688fbdc6-w7hxk" Apr 16 16:02:02.939524 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.939490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w97np" Apr 16 16:02:02.981013 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.980981 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" event={"ID":"19275ccb-4724-4f17-8bca-e669bfafe552","Type":"ContainerStarted","Data":"3c8cf730e740c70d2addea47dac8b041c04bd50bb8c13b422067e980aaf1096d"} Apr 16 16:02:02.981215 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.981192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" event={"ID":"19275ccb-4724-4f17-8bca-e669bfafe552","Type":"ContainerStarted","Data":"7e81b94d276de66ed8e871fdbf73a7a7c9c845566c890e93345923bd59cff703"} Apr 16 16:02:02.982610 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.982586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" event={"ID":"c4e2a6f2-a469-4d6a-92f6-19e95df574de","Type":"ContainerStarted","Data":"5b0af19617db9e17eaaf5eb31d0432e07fe21006286e551653b8dd67b197dc85"} Apr 16 16:02:02.982766 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.982749 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:02:02.984364 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.984341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" event={"ID":"30d4ab93-1d05-421c-a438-24378cc5b1b4","Type":"ContainerStarted","Data":"bb9f858c3406bdfef4d8704db43ff8f64c86203c90a38627aff4cf523f4326aa"} Apr 16 16:02:02.985878 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.985854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" event={"ID":"ad8ff5ed-0304-45fd-87fb-809604b2cc69","Type":"ContainerStarted","Data":"9ed54bb1a3151201394c4204c33294907dd9498ee0bb3a8ebc7c6f0ef9a63c31"} Apr 16 16:02:02.987787 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:02:02.987764 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" event={"ID":"c8c970dd-50f2-41a4-824e-2a03a125acfd","Type":"ContainerStarted","Data":"1b56c9df6e51a74bc2888fe4b3293e94ae78e2063c0b0d8e0a5bbbb29dac4d5c"} Apr 16 16:02:02.988346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.988330 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" Apr 16 16:02:02.989522 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:02.989478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" event={"ID":"5f55b121-017a-43b5-97b3-540ef3a398d6","Type":"ContainerStarted","Data":"d660e9dc63a9b30390ff1697e71baae50ab576c62f8b1a9292e670addb1c768f"} Apr 16 16:02:03.000967 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.000916 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-wcrv2" podStartSLOduration=1.7027122860000001 podStartE2EDuration="6.000899415s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 16:01:57.804041974 +0000 UTC m=+73.924699891" lastFinishedPulling="2026-04-16 16:02:02.102229084 +0000 UTC m=+78.222887020" observedRunningTime="2026-04-16 16:02:02.9985059 +0000 UTC m=+79.119163836" watchObservedRunningTime="2026-04-16 16:02:03.000899415 +0000 UTC m=+79.121557412" Apr 16 16:02:03.017254 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.017204 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-6hkg6" podStartSLOduration=1.476594944 podStartE2EDuration="6.01719066s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 16:01:57.562332272 +0000 UTC m=+73.682990200" 
lastFinishedPulling="2026-04-16 16:02:02.102927985 +0000 UTC m=+78.223585916" observedRunningTime="2026-04-16 16:02:03.015289485 +0000 UTC m=+79.135947417" watchObservedRunningTime="2026-04-16 16:02:03.01719066 +0000 UTC m=+79.137848603" Apr 16 16:02:03.032665 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.032619 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6dhvc" podStartSLOduration=1.983778075 podStartE2EDuration="6.032606545s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 16:01:58.059180099 +0000 UTC m=+74.179838017" lastFinishedPulling="2026-04-16 16:02:02.10800856 +0000 UTC m=+78.228666487" observedRunningTime="2026-04-16 16:02:03.031794114 +0000 UTC m=+79.152452064" watchObservedRunningTime="2026-04-16 16:02:03.032606545 +0000 UTC m=+79.153264513" Apr 16 16:02:03.050421 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.050381 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-5szhf" podStartSLOduration=1.8211059779999998 podStartE2EDuration="6.050368065s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 16:01:57.875942266 +0000 UTC m=+73.996600185" lastFinishedPulling="2026-04-16 16:02:02.105204355 +0000 UTC m=+78.225862272" observedRunningTime="2026-04-16 16:02:03.050002749 +0000 UTC m=+79.170660690" watchObservedRunningTime="2026-04-16 16:02:03.050368065 +0000 UTC m=+79.171025995" Apr 16 16:02:03.091671 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.091612 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-t2wfh" podStartSLOduration=1.7720577020000001 podStartE2EDuration="6.091594435s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 
16:01:57.787390256 +0000 UTC m=+73.908048188" lastFinishedPulling="2026-04-16 16:02:02.106926986 +0000 UTC m=+78.227584921" observedRunningTime="2026-04-16 16:02:03.065713932 +0000 UTC m=+79.186371873" watchObservedRunningTime="2026-04-16 16:02:03.091594435 +0000 UTC m=+79.212252375" Apr 16 16:02:03.092376 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.092349 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bxn9d" podStartSLOduration=1.792179001 podStartE2EDuration="6.092342907s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="2026-04-16 16:01:57.802148289 +0000 UTC m=+73.922806209" lastFinishedPulling="2026-04-16 16:02:02.102312183 +0000 UTC m=+78.222970115" observedRunningTime="2026-04-16 16:02:03.090781852 +0000 UTC m=+79.211439792" watchObservedRunningTime="2026-04-16 16:02:03.092342907 +0000 UTC m=+79.213000846" Apr 16 16:02:03.866924 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:03.866897 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lc585" Apr 16 16:02:05.436034 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:05.436000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w97np_600ee1fa-9fa9-462c-88e4-75f0cb67d3e7/dns/0.log" Apr 16 16:02:05.617440 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:05.617411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w97np_600ee1fa-9fa9-462c-88e4-75f0cb67d3e7/kube-rbac-proxy/0.log" Apr 16 16:02:06.216087 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:06.216059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rbq9r_3ef774fd-1deb-4665-96a2-b7fe3ffcc315/dns-node-resolver/0.log" Apr 16 16:02:06.616928 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:06.616904 2577 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-image-registry_image-registry-b5f59564f-rfdx5_7d391ccc-e374-4094-94b3-f339069ebc92/registry/0.log" Apr 16 16:02:06.821077 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:06.821049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lz8kp_72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f/node-ca/0.log" Apr 16 16:02:07.417498 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:07.417439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76688fbdc6-w7hxk_16874dd3-186f-4feb-bc62-e0eae6ce6072/router/0.log" Apr 16 16:02:07.617208 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:07.617185 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gt8zg_518655bd-6e85-4b6c-9e7d-11b4b7bc3edd/serve-healthcheck-canary/0.log" Apr 16 16:02:08.617295 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:08.617262 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-5szhf_30d4ab93-1d05-421c-a438-24378cc5b1b4/kube-storage-version-migrator-operator/0.log" Apr 16 16:02:08.865547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:08.865509 2577 patch_prober.go:28] interesting pod/image-registry-b5f59564f-rfdx5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 16:02:08.865721 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:08.865565 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 
16:02:10.925086 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:10.925056 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" Apr 16 16:02:12.180585 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.180549 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-q585g"] Apr 16 16:02:12.184282 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.184259 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.186879 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.186852 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:02:12.186989 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.186931 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:02:12.187132 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.187117 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rqdzt\"" Apr 16 16:02:12.187217 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.187179 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:02:12.188134 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.188115 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:02:12.274898 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.274866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.274921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-textfile\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.274953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.274979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-sys\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.275001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkz6\" (UniqueName: \"kubernetes.io/projected/1f6b813b-ad30-4c0d-a05b-17adb2975c08-kube-api-access-fdkz6\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.275029 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-root\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275062 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.275055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-wtmp\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275398 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.275093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-accelerators-collector-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.275398 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.275129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-metrics-client-ca\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376339 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-wtmp\") pod \"node-exporter-q585g\" (UID: 
\"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-accelerators-collector-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-metrics-client-ca\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-textfile\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-sys\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdkz6\" (UniqueName: \"kubernetes.io/projected/1f6b813b-ad30-4c0d-a05b-17adb2975c08-kube-api-access-fdkz6\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-root\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.376701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-sys\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.377232 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-root\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.377232 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.376493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-wtmp\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.377232 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:02:12.376760 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:02:12.377232 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:02:12.376845 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls podName:1f6b813b-ad30-4c0d-a05b-17adb2975c08 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:12.876823428 +0000 UTC m=+88.997481345 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls") pod "node-exporter-q585g" (UID: "1f6b813b-ad30-4c0d-a05b-17adb2975c08") : secret "node-exporter-tls" not found Apr 16 16:02:12.377232 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.377135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-accelerators-collector-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.377569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.377280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-textfile\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.377719 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.377658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f6b813b-ad30-4c0d-a05b-17adb2975c08-metrics-client-ca\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.379195 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.379166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.385545 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:02:12.385527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdkz6\" (UniqueName: \"kubernetes.io/projected/1f6b813b-ad30-4c0d-a05b-17adb2975c08-kube-api-access-fdkz6\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.881657 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.881628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:12.884176 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:12.884146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f6b813b-ad30-4c0d-a05b-17adb2975c08-node-exporter-tls\") pod \"node-exporter-q585g\" (UID: \"1f6b813b-ad30-4c0d-a05b-17adb2975c08\") " pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:13.097020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:13.096982 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-q585g" Apr 16 16:02:13.106802 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:02:13.106757 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6b813b_ad30_4c0d_a05b_17adb2975c08.slice/crio-187145e5e61bb6d52f171a1d8b30c4b76aed4adee16542ba1a0001b42286b561 WatchSource:0}: Error finding container 187145e5e61bb6d52f171a1d8b30c4b76aed4adee16542ba1a0001b42286b561: Status 404 returned error can't find the container with id 187145e5e61bb6d52f171a1d8b30c4b76aed4adee16542ba1a0001b42286b561 Apr 16 16:02:14.022922 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:14.022854 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f6b813b-ad30-4c0d-a05b-17adb2975c08" containerID="76ba97bb13d799d5c38e21f36274c3241c6c422769e206660e206e2e015282a2" exitCode=0 Apr 16 16:02:14.023245 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:14.022927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q585g" event={"ID":"1f6b813b-ad30-4c0d-a05b-17adb2975c08","Type":"ContainerDied","Data":"76ba97bb13d799d5c38e21f36274c3241c6c422769e206660e206e2e015282a2"} Apr 16 16:02:14.023245 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:14.022963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q585g" event={"ID":"1f6b813b-ad30-4c0d-a05b-17adb2975c08","Type":"ContainerStarted","Data":"187145e5e61bb6d52f171a1d8b30c4b76aed4adee16542ba1a0001b42286b561"} Apr 16 16:02:15.028210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:15.028171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q585g" event={"ID":"1f6b813b-ad30-4c0d-a05b-17adb2975c08","Type":"ContainerStarted","Data":"52db59a35e13d37cb3f4b33d5f6b8959ae339d0104f2f053a0452c35895d0645"} Apr 16 16:02:15.028210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:15.028211 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q585g" event={"ID":"1f6b813b-ad30-4c0d-a05b-17adb2975c08","Type":"ContainerStarted","Data":"c4ddf2f2b6c272f2f15ae5d5e25cd200da4e3ea1156d2ddcdb0114cb996007c7"} Apr 16 16:02:15.049605 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:15.049547 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-q585g" podStartSLOduration=2.393226832 podStartE2EDuration="3.049529723s" podCreationTimestamp="2026-04-16 16:02:12 +0000 UTC" firstStartedPulling="2026-04-16 16:02:13.108736993 +0000 UTC m=+89.229394913" lastFinishedPulling="2026-04-16 16:02:13.765039878 +0000 UTC m=+89.885697804" observedRunningTime="2026-04-16 16:02:15.047973703 +0000 UTC m=+91.168631654" watchObservedRunningTime="2026-04-16 16:02:15.049529723 +0000 UTC m=+91.170187674" Apr 16 16:02:19.277642 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.277606 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gjzps"] Apr 16 16:02:19.280407 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.280391 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.285857 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.285827 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:02:19.285857 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.285830 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:02:19.286032 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.285924 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-frjrw\"" Apr 16 16:02:19.289157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.289125 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gjzps"] Apr 16 16:02:19.433359 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.433320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/42c41435-4859-4c94-aa0a-bf577233ed83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.433359 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.433357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42c41435-4859-4c94-aa0a-bf577233ed83-crio-socket\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.433645 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.433384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/42c41435-4859-4c94-aa0a-bf577233ed83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.433645 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.433418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/42c41435-4859-4c94-aa0a-bf577233ed83-data-volume\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.433645 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.433556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmms\" (UniqueName: \"kubernetes.io/projected/42c41435-4859-4c94-aa0a-bf577233ed83-kube-api-access-xxmms\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534427 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/42c41435-4859-4c94-aa0a-bf577233ed83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534427 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42c41435-4859-4c94-aa0a-bf577233ed83-crio-socket\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " 
pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/42c41435-4859-4c94-aa0a-bf577233ed83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534487 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42c41435-4859-4c94-aa0a-bf577233ed83-crio-socket\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/42c41435-4859-4c94-aa0a-bf577233ed83-data-volume\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmms\" (UniqueName: \"kubernetes.io/projected/42c41435-4859-4c94-aa0a-bf577233ed83-kube-api-access-xxmms\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534817 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/42c41435-4859-4c94-aa0a-bf577233ed83-data-volume\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.534931 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.534914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/42c41435-4859-4c94-aa0a-bf577233ed83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.537139 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.537111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/42c41435-4859-4c94-aa0a-bf577233ed83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.543368 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.543349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmms\" (UniqueName: \"kubernetes.io/projected/42c41435-4859-4c94-aa0a-bf577233ed83-kube-api-access-xxmms\") pod \"insights-runtime-extractor-gjzps\" (UID: \"42c41435-4859-4c94-aa0a-bf577233ed83\") " pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.589826 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.589787 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gjzps" Apr 16 16:02:19.706863 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:19.706842 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gjzps"] Apr 16 16:02:19.709429 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:02:19.709401 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c41435_4859_4c94_aa0a_bf577233ed83.slice/crio-eb415e7a4f3b47f420095a7c9838191ec861a52ecd14b0e238bb8d44ad0f5cd2 WatchSource:0}: Error finding container eb415e7a4f3b47f420095a7c9838191ec861a52ecd14b0e238bb8d44ad0f5cd2: Status 404 returned error can't find the container with id eb415e7a4f3b47f420095a7c9838191ec861a52ecd14b0e238bb8d44ad0f5cd2 Apr 16 16:02:20.042404 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:20.042365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjzps" event={"ID":"42c41435-4859-4c94-aa0a-bf577233ed83","Type":"ContainerStarted","Data":"a91f90fa9bae7879166ec7d29822c03f557c52a8269f7d66b4659dd427083147"} Apr 16 16:02:20.042404 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:20.042404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjzps" event={"ID":"42c41435-4859-4c94-aa0a-bf577233ed83","Type":"ContainerStarted","Data":"eb415e7a4f3b47f420095a7c9838191ec861a52ecd14b0e238bb8d44ad0f5cd2"} Apr 16 16:02:21.047056 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:21.047019 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjzps" event={"ID":"42c41435-4859-4c94-aa0a-bf577233ed83","Type":"ContainerStarted","Data":"bf69cc4f957c24336913a3f1735182321626b17e47e0459c5f4008fcff175f56"} Apr 16 16:02:23.054647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:23.054599 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-gjzps" event={"ID":"42c41435-4859-4c94-aa0a-bf577233ed83","Type":"ContainerStarted","Data":"a7757cc2edb117617dd9037d7df58763ba70b717f838b48b4023ca913e146534"}
Apr 16 16:02:23.076261 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:23.076217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gjzps" podStartSLOduration=1.735797177 podStartE2EDuration="4.076205499s" podCreationTimestamp="2026-04-16 16:02:19 +0000 UTC" firstStartedPulling="2026-04-16 16:02:19.756725362 +0000 UTC m=+95.877383278" lastFinishedPulling="2026-04-16 16:02:22.097133684 +0000 UTC m=+98.217791600" observedRunningTime="2026-04-16 16:02:23.074609719 +0000 UTC m=+99.195267658" watchObservedRunningTime="2026-04-16 16:02:23.076205499 +0000 UTC m=+99.196863437"
Apr 16 16:02:23.934700 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:23.934669 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"]
Apr 16 16:02:48.954154 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:48.954109 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" containerName="registry" containerID="cri-o://c84279afb1f318049819218fb8c2db7b85b5e9669591db2a1500637718737019" gracePeriod=30
Apr 16 16:02:49.128625 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.128593 2577 generic.go:358] "Generic (PLEG): container finished" podID="7d391ccc-e374-4094-94b3-f339069ebc92" containerID="c84279afb1f318049819218fb8c2db7b85b5e9669591db2a1500637718737019" exitCode=0
Apr 16 16:02:49.128757 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.128643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" event={"ID":"7d391ccc-e374-4094-94b3-f339069ebc92","Type":"ContainerDied","Data":"c84279afb1f318049819218fb8c2db7b85b5e9669591db2a1500637718737019"}
Apr 16 16:02:49.184827 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.184783 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:02:49.282805 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282720 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.282805 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282758 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.282805 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282800 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282897 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282943 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbz4f\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.282980 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.283014 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token\") pod \"7d391ccc-e374-4094-94b3-f339069ebc92\" (UID: \"7d391ccc-e374-4094-94b3-f339069ebc92\") "
Apr 16 16:02:49.283301 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.283279 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:02:49.283462 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.283406 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:02:49.285906 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.285872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:02:49.286059 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.286035 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:02:49.286059 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.286037 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f" (OuterVolumeSpecName: "kube-api-access-mbz4f") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "kube-api-access-mbz4f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:02:49.286236 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.286112 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:02:49.286236 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.286174 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:02:49.293029 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.293001 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d391ccc-e374-4094-94b3-f339069ebc92" (UID: "7d391ccc-e374-4094-94b3-f339069ebc92"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:02:49.384081 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384041 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-registry-certificates\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384081 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384074 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-image-registry-private-configuration\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384081 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384087 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d391ccc-e374-4094-94b3-f339069ebc92-ca-trust-extracted\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384098 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d391ccc-e374-4094-94b3-f339069ebc92-trusted-ca\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384120 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d391ccc-e374-4094-94b3-f339069ebc92-installation-pull-secrets\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384129 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbz4f\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-kube-api-access-mbz4f\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384137 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-registry-tls\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:49.384304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:49.384146 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d391ccc-e374-4094-94b3-f339069ebc92-bound-sa-token\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:02:50.131899 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.131870 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5"
Apr 16 16:02:50.131899 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.131884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b5f59564f-rfdx5" event={"ID":"7d391ccc-e374-4094-94b3-f339069ebc92","Type":"ContainerDied","Data":"14b9396264f2c0b0e451ab6311771308eacd54f1a904c22bb79b3d3de6ac9f22"}
Apr 16 16:02:50.132346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.131935 2577 scope.go:117] "RemoveContainer" containerID="c84279afb1f318049819218fb8c2db7b85b5e9669591db2a1500637718737019"
Apr 16 16:02:50.154224 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.154198 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"]
Apr 16 16:02:50.158373 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.158352 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b5f59564f-rfdx5"]
Apr 16 16:02:50.549390 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:50.549315 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" path="/var/lib/kubelet/pods/7d391ccc-e374-4094-94b3-f339069ebc92/volumes"
Apr 16 16:02:53.142828 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:53.142793 2577 generic.go:358] "Generic (PLEG): container finished" podID="b37a5c3f-7b9a-4dcd-8949-35c0027725f3" containerID="c71cdd7614266edc11d0847692d2224f9724b8a8664eae092b5dce276081f434" exitCode=0
Apr 16 16:02:53.143241 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:53.142862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" event={"ID":"b37a5c3f-7b9a-4dcd-8949-35c0027725f3","Type":"ContainerDied","Data":"c71cdd7614266edc11d0847692d2224f9724b8a8664eae092b5dce276081f434"}
Apr 16 16:02:53.143241 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:53.143172 2577 scope.go:117] "RemoveContainer" containerID="c71cdd7614266edc11d0847692d2224f9724b8a8664eae092b5dce276081f434"
Apr 16 16:02:54.147105 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:54.147070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-8ts28" event={"ID":"b37a5c3f-7b9a-4dcd-8949-35c0027725f3","Type":"ContainerStarted","Data":"f49665ca88bac113e5b7b5060b2b1e013382e31c01c3eb0d6f33b079c8f8b573"}
Apr 16 16:02:57.158609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:02:57.158572 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" podUID="43b389bd-40e6-4dcc-9713-4854f0caa438" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 16:03:07.159018 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:07.158979 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" podUID="43b389bd-40e6-4dcc-9713-4854f0caa438" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 16:03:17.159052 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:17.159009 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" podUID="43b389bd-40e6-4dcc-9713-4854f0caa438" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 16:03:17.159444 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:17.159098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6"
Apr 16 16:03:17.159626 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:17.159594 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"f0588545c4db0238de41e5e638a4b466bc7c75e8725fd55de6db2b23f833d277"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 16:03:17.159669 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:17.159648 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" podUID="43b389bd-40e6-4dcc-9713-4854f0caa438" containerName="service-proxy" containerID="cri-o://f0588545c4db0238de41e5e638a4b466bc7c75e8725fd55de6db2b23f833d277" gracePeriod=30
Apr 16 16:03:18.212287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:18.212254 2577 generic.go:358] "Generic (PLEG): container finished" podID="43b389bd-40e6-4dcc-9713-4854f0caa438" containerID="f0588545c4db0238de41e5e638a4b466bc7c75e8725fd55de6db2b23f833d277" exitCode=2
Apr 16 16:03:18.212677 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:18.212305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerDied","Data":"f0588545c4db0238de41e5e638a4b466bc7c75e8725fd55de6db2b23f833d277"}
Apr 16 16:03:18.212677 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:03:18.212333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d78c68649-hdth6" event={"ID":"43b389bd-40e6-4dcc-9713-4854f0caa438","Type":"ContainerStarted","Data":"1e63118b6d5a82cea10390fe382676581fd89da92f70efdaaa6ecb73a9fd7156"}
Apr 16 16:05:44.458264 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:05:44.458230 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:05:44.458830 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:05:44.458815 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:05:44.464963 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:05:44.464940 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 16:10:44.480095 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:10:44.480064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:10:44.480651 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:10:44.480190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:15:44.505379 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:15:44.505311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:15:44.505379 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:15:44.505320 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:16:03.806484 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.806434 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"]
Apr 16 16:16:03.806964 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.806737 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" containerName="registry"
Apr 16 16:16:03.806964 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.806748 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" containerName="registry"
Apr 16 16:16:03.806964 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.806795 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d391ccc-e374-4094-94b3-f339069ebc92" containerName="registry"
Apr 16 16:16:03.809553 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.809537 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:03.813871 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.813853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 16:16:03.815098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.815067 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 16:16:03.815098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.815094 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 16:16:03.815259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.815117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:16:03.815259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.815173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5gld5\""
Apr 16 16:16:03.815259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.815188 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 16:16:03.830213 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.830190 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"]
Apr 16 16:16:03.990125 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.990091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:03.990264 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.990133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b007a35-c94c-4587-8a6c-c23126562d5d-manager-config\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:03.990264 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.990202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwt7\" (UniqueName: \"kubernetes.io/projected/2b007a35-c94c-4587-8a6c-c23126562d5d-kube-api-access-2jwt7\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:03.990264 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:03.990232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-metrics-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.091138 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.091051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-metrics-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.091138 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.091115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.091353 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.091144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b007a35-c94c-4587-8a6c-c23126562d5d-manager-config\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.091353 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.091172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwt7\" (UniqueName: \"kubernetes.io/projected/2b007a35-c94c-4587-8a6c-c23126562d5d-kube-api-access-2jwt7\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.091852 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.091833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b007a35-c94c-4587-8a6c-c23126562d5d-manager-config\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.093778 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.093756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.093854 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.093810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b007a35-c94c-4587-8a6c-c23126562d5d-metrics-cert\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.100337 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.100315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwt7\" (UniqueName: \"kubernetes.io/projected/2b007a35-c94c-4587-8a6c-c23126562d5d-kube-api-access-2jwt7\") pod \"lws-controller-manager-5879d548d6-777dg\" (UID: \"2b007a35-c94c-4587-8a6c-c23126562d5d\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.121142 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.121119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:04.241781 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.241759 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"]
Apr 16 16:16:04.244258 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:16:04.244232 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b007a35_c94c_4587_8a6c_c23126562d5d.slice/crio-63b99c6cd3e10363dbb2ba2e451581e2352a0eef6e73d0a6ad70ddce53d2f645 WatchSource:0}: Error finding container 63b99c6cd3e10363dbb2ba2e451581e2352a0eef6e73d0a6ad70ddce53d2f645: Status 404 returned error can't find the container with id 63b99c6cd3e10363dbb2ba2e451581e2352a0eef6e73d0a6ad70ddce53d2f645
Apr 16 16:16:04.245982 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.245964 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:16:04.311725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:04.311695 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg" event={"ID":"2b007a35-c94c-4587-8a6c-c23126562d5d","Type":"ContainerStarted","Data":"63b99c6cd3e10363dbb2ba2e451581e2352a0eef6e73d0a6ad70ddce53d2f645"}
Apr 16 16:16:11.332635 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:11.332597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg" event={"ID":"2b007a35-c94c-4587-8a6c-c23126562d5d","Type":"ContainerStarted","Data":"48236b5c729bd6d7be1ef88e6156203ffa13ea91a4fc1ca22e68352fea2d9868"}
Apr 16 16:16:11.333026 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:11.332742 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:11.355574 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:11.355532 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg" podStartSLOduration=1.362277703 podStartE2EDuration="8.355518794s" podCreationTimestamp="2026-04-16 16:16:03 +0000 UTC" firstStartedPulling="2026-04-16 16:16:04.246115385 +0000 UTC m=+920.366773304" lastFinishedPulling="2026-04-16 16:16:11.239356479 +0000 UTC m=+927.360014395" observedRunningTime="2026-04-16 16:16:11.353261858 +0000 UTC m=+927.473919796" watchObservedRunningTime="2026-04-16 16:16:11.355518794 +0000 UTC m=+927.476176733"
Apr 16 16:16:22.340323 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:22.340294 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-777dg"
Apr 16 16:16:57.442867 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.442825 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"]
Apr 16 16:16:57.451720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.451687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.453003 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.452980 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"]
Apr 16 16:16:57.454257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.454236 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 16:16:57.454354 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.454237 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 16:16:57.454354 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.454290 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9p747\""
Apr 16 16:16:57.454354 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.454237 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 16:16:57.455389 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.455372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 16:16:57.593756 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.593723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c020fb01-b472-49c6-9d21-a85264d58aec-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.593923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.593767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzpp\" (UniqueName: \"kubernetes.io/projected/c020fb01-b472-49c6-9d21-a85264d58aec-kube-api-access-qfzpp\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.593923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.593797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c020fb01-b472-49c6-9d21-a85264d58aec-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.694246 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.694140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c020fb01-b472-49c6-9d21-a85264d58aec-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.694424 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.694298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c020fb01-b472-49c6-9d21-a85264d58aec-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.694424 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.694387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzpp\" (UniqueName: \"kubernetes.io/projected/c020fb01-b472-49c6-9d21-a85264d58aec-kube-api-access-qfzpp\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.694937 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.694911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c020fb01-b472-49c6-9d21-a85264d58aec-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.697092 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.697068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c020fb01-b472-49c6-9d21-a85264d58aec-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.703371 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.703344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzpp\" (UniqueName: \"kubernetes.io/projected/c020fb01-b472-49c6-9d21-a85264d58aec-kube-api-access-qfzpp\") pod \"kuadrant-console-plugin-6c886788f8-92v9v\" (UID: \"c020fb01-b472-49c6-9d21-a85264d58aec\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.761377 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.761354 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"
Apr 16 16:16:57.877558 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:57.877530 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v"]
Apr 16 16:16:57.879933 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:16:57.879904 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc020fb01_b472_49c6_9d21_a85264d58aec.slice/crio-c2cbaecad53ddf327ab39d73ca41cdeca47219d2283a099267e2ad66a208e5f5 WatchSource:0}: Error finding container c2cbaecad53ddf327ab39d73ca41cdeca47219d2283a099267e2ad66a208e5f5: Status 404 returned error can't find the container with id c2cbaecad53ddf327ab39d73ca41cdeca47219d2283a099267e2ad66a208e5f5
Apr 16 16:16:58.464116 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:16:58.464081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v" event={"ID":"c020fb01-b472-49c6-9d21-a85264d58aec","Type":"ContainerStarted","Data":"c2cbaecad53ddf327ab39d73ca41cdeca47219d2283a099267e2ad66a208e5f5"}
Apr 16 16:17:04.485327 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:04.485291 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v" event={"ID":"c020fb01-b472-49c6-9d21-a85264d58aec","Type":"ContainerStarted","Data":"a3346c3b7a7c120f7ac7e2d0b84077ebd4bc7c78021829b09218286a78457e0b"}
Apr 16 16:17:04.502391 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:04.502348 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-92v9v" podStartSLOduration=1.329969685 podStartE2EDuration="7.502334728s" podCreationTimestamp="2026-04-16 16:16:57 +0000 UTC" firstStartedPulling="2026-04-16 16:16:57.881238433 +0000 UTC m=+974.001896353" lastFinishedPulling="2026-04-16 16:17:04.053603475 +0000 UTC m=+980.174261396" observedRunningTime="2026-04-16 16:17:04.500046039 +0000 UTC m=+980.620703979" watchObservedRunningTime="2026-04-16 16:17:04.502334728 +0000 UTC m=+980.622992664"
Apr 16 16:17:37.876436 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.876401 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j5p5j"]
Apr 16 16:17:37.884514 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.884485 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j"
Apr 16 16:17:37.887325 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.887304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 16:17:37.890478 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.890438 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j5p5j"]
Apr 16 16:17:37.919750 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.919726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnwfq\" (UniqueName: \"kubernetes.io/projected/e542dd41-bcf6-4468-9e60-d9cbabe88534-kube-api-access-cnwfq\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j"
Apr 16 16:17:37.919876 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:37.919794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e542dd41-bcf6-4468-9e60-d9cbabe88534-config-file\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j"
Apr 16 16:17:37.922069 ip-10-0-140-78 kubenswrapper[2577]:
I0416 16:17:37.922045 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j5p5j"] Apr 16 16:17:38.020494 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.020430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnwfq\" (UniqueName: \"kubernetes.io/projected/e542dd41-bcf6-4468-9e60-d9cbabe88534-kube-api-access-cnwfq\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:38.020701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.020559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e542dd41-bcf6-4468-9e60-d9cbabe88534-config-file\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:38.021147 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.021123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e542dd41-bcf6-4468-9e60-d9cbabe88534-config-file\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:38.030116 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.030086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnwfq\" (UniqueName: \"kubernetes.io/projected/e542dd41-bcf6-4468-9e60-d9cbabe88534-kube-api-access-cnwfq\") pod \"limitador-limitador-67566c68b4-j5p5j\" (UID: \"e542dd41-bcf6-4468-9e60-d9cbabe88534\") " pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:38.196846 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.196754 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:38.314005 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.313973 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-j5p5j"] Apr 16 16:17:38.317192 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:17:38.317166 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode542dd41_bcf6_4468_9e60_d9cbabe88534.slice/crio-1152c6f98d8f85c29f8910c2d6168c45ccfcb3e43cb97dd2f92cf52f8be238d2 WatchSource:0}: Error finding container 1152c6f98d8f85c29f8910c2d6168c45ccfcb3e43cb97dd2f92cf52f8be238d2: Status 404 returned error can't find the container with id 1152c6f98d8f85c29f8910c2d6168c45ccfcb3e43cb97dd2f92cf52f8be238d2 Apr 16 16:17:38.581984 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:38.581893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" event={"ID":"e542dd41-bcf6-4468-9e60-d9cbabe88534","Type":"ContainerStarted","Data":"1152c6f98d8f85c29f8910c2d6168c45ccfcb3e43cb97dd2f92cf52f8be238d2"} Apr 16 16:17:39.586120 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:39.586085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" event={"ID":"e542dd41-bcf6-4468-9e60-d9cbabe88534","Type":"ContainerStarted","Data":"09a5fb47cdc3aaee81a83451b834bef6647ab563401db5986fe21024616fd631"} Apr 16 16:17:39.586529 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:39.586202 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:17:39.603544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:39.603492 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" podStartSLOduration=1.492164757 
podStartE2EDuration="2.603478979s" podCreationTimestamp="2026-04-16 16:17:37 +0000 UTC" firstStartedPulling="2026-04-16 16:17:38.319002147 +0000 UTC m=+1014.439660064" lastFinishedPulling="2026-04-16 16:17:39.430316369 +0000 UTC m=+1015.550974286" observedRunningTime="2026-04-16 16:17:39.601081198 +0000 UTC m=+1015.721739138" watchObservedRunningTime="2026-04-16 16:17:39.603478979 +0000 UTC m=+1015.724136921" Apr 16 16:17:50.590699 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:17:50.590670 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-j5p5j" Apr 16 16:20:27.654864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.654828 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:20:27.658722 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.658701 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.662810 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.662787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:20:27.662946 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.662817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:20:27.662946 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.662825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 16:20:27.663137 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.662944 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 
16:20:27.668788 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.668764 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:20:27.700574 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65kx\" (UniqueName: \"kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.700725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.700725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.700725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.700725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.700887 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.700736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802063 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802063 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p65kx\" (UniqueName: 
\"kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802520 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802501 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802579 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.802579 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.802565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.804566 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.804543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.804741 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.804726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.810792 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.810769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65kx\" (UniqueName: \"kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:27.970356 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:27.970246 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:28.091883 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:28.091851 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:20:28.096097 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:20:28.096073 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053aca76_2395_4ccb_b967_df89fb88ff5c.slice/crio-0aee834de0e79b1c26189b270a29863aac49c7edaffe63c4c6e53aa2220398f1 WatchSource:0}: Error finding container 0aee834de0e79b1c26189b270a29863aac49c7edaffe63c4c6e53aa2220398f1: Status 404 returned error can't find the container with id 0aee834de0e79b1c26189b270a29863aac49c7edaffe63c4c6e53aa2220398f1 Apr 16 16:20:29.076411 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:29.076374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerStarted","Data":"0aee834de0e79b1c26189b270a29863aac49c7edaffe63c4c6e53aa2220398f1"} Apr 16 16:20:32.087258 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:32.087219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerStarted","Data":"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822"} Apr 16 16:20:36.108888 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:36.108856 2577 generic.go:358] "Generic (PLEG): container finished" podID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerID="1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822" exitCode=0 Apr 16 16:20:36.109254 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:20:36.108927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerDied","Data":"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822"} Apr 16 16:20:38.116287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:38.116251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerStarted","Data":"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7"} Apr 16 16:20:38.136633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:38.136581 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" podStartSLOduration=1.846977373 podStartE2EDuration="11.13656628s" podCreationTimestamp="2026-04-16 16:20:27 +0000 UTC" firstStartedPulling="2026-04-16 16:20:28.097833468 +0000 UTC m=+1184.218491385" lastFinishedPulling="2026-04-16 16:20:37.387422375 +0000 UTC m=+1193.508080292" observedRunningTime="2026-04-16 16:20:38.134178603 +0000 UTC m=+1194.254836544" watchObservedRunningTime="2026-04-16 16:20:38.13656628 +0000 UTC m=+1194.257224218" Apr 16 16:20:44.532998 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:44.532968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:20:44.533533 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:44.533193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:20:47.971121 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:47.971088 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:47.971546 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:47.971134 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:47.983562 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:47.983539 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:20:48.156083 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:20:48.156054 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:22:00.805800 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.805721 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:00.809200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.809185 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.812226 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.812204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 16:22:00.821006 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.820978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:00.975647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.975647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.975874 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.975874 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.975874 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:00.975874 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:00.975775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9dp8\" (UniqueName: \"kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077038 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.076938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9dp8\" (UniqueName: \"kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077210 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: 
\"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077582 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077666 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.077666 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.077623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.079685 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.079663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.079956 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.079939 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.090723 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.090694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9dp8\" (UniqueName: \"kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-j9z9n\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.118668 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.118634 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:01.260668 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.260608 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:01.262939 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:22:01.262910 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31692745_5708_443d_b838_811e7a894168.slice/crio-f1c8a2cfa837b0ec1d6fa33cf362693a2587b46940737ba685f0656fae360124 WatchSource:0}: Error finding container f1c8a2cfa837b0ec1d6fa33cf362693a2587b46940737ba685f0656fae360124: Status 404 returned error can't find the container with id f1c8a2cfa837b0ec1d6fa33cf362693a2587b46940737ba685f0656fae360124 Apr 16 16:22:01.264704 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.264688 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:22:01.355131 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.355057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerStarted","Data":"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f"} Apr 16 16:22:01.355131 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:01.355092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerStarted","Data":"f1c8a2cfa837b0ec1d6fa33cf362693a2587b46940737ba685f0656fae360124"} Apr 16 16:22:06.375242 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:06.375194 2577 generic.go:358] "Generic (PLEG): container finished" podID="31692745-5708-443d-b838-811e7a894168" containerID="b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f" exitCode=0 Apr 16 16:22:06.375744 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:06.375272 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerDied","Data":"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f"} Apr 16 16:22:07.379924 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:07.379893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerStarted","Data":"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e"} Apr 16 16:22:07.407211 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:07.407166 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" podStartSLOduration=7.407153109 podStartE2EDuration="7.407153109s" 
podCreationTimestamp="2026-04-16 16:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:07.404187896 +0000 UTC m=+1283.524845835" watchObservedRunningTime="2026-04-16 16:22:07.407153109 +0000 UTC m=+1283.527811048" Apr 16 16:22:11.119051 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:11.119010 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:11.119051 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:11.119050 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:11.131612 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:11.131592 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:11.404858 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:11.404774 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:43.740150 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:43.740108 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:43.740682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:43.740512 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" podUID="31692745-5708-443d-b838-811e7a894168" containerName="main" containerID="cri-o://b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e" gracePeriod=30 Apr 16 16:22:43.986638 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:43.986617 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:44.035067 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.034982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035067 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035027 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9dp8\" (UniqueName: \"kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035067 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035061 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035317 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035115 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035317 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035146 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035317 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:22:44.035237 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm\") pod \"31692745-5708-443d-b838-811e7a894168\" (UID: \"31692745-5708-443d-b838-811e7a894168\") " Apr 16 16:22:44.035486 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035374 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home" (OuterVolumeSpecName: "home") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:44.035541 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035523 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.035587 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.035519 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache" (OuterVolumeSpecName: "model-cache") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:44.037526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.037492 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm" (OuterVolumeSpecName: "dshm") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:44.037630 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.037567 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8" (OuterVolumeSpecName: "kube-api-access-b9dp8") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "kube-api-access-b9dp8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:22:44.037630 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.037569 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:22:44.089904 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.089863 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "31692745-5708-443d-b838-811e7a894168" (UID: "31692745-5708-443d-b838-811e7a894168"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:44.136589 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.136560 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.136589 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.136584 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.136589 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.136595 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31692745-5708-443d-b838-811e7a894168-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.136813 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.136605 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9dp8\" (UniqueName: \"kubernetes.io/projected/31692745-5708-443d-b838-811e7a894168-kube-api-access-b9dp8\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.136813 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.136614 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31692745-5708-443d-b838-811e7a894168-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:22:44.506050 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.506014 2577 generic.go:358] "Generic (PLEG): container finished" podID="31692745-5708-443d-b838-811e7a894168" containerID="b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e" exitCode=0 Apr 16 16:22:44.506234 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.506083 2577 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" Apr 16 16:22:44.506234 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.506093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerDied","Data":"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e"} Apr 16 16:22:44.506234 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.506134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n" event={"ID":"31692745-5708-443d-b838-811e7a894168","Type":"ContainerDied","Data":"f1c8a2cfa837b0ec1d6fa33cf362693a2587b46940737ba685f0656fae360124"} Apr 16 16:22:44.506234 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.506147 2577 scope.go:117] "RemoveContainer" containerID="b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e" Apr 16 16:22:44.515110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.515092 2577 scope.go:117] "RemoveContainer" containerID="b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f" Apr 16 16:22:44.528978 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.528959 2577 scope.go:117] "RemoveContainer" containerID="b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e" Apr 16 16:22:44.529270 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:22:44.529249 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e\": container with ID starting with b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e not found: ID does not exist" containerID="b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e" Apr 16 16:22:44.529333 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.529280 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e"} err="failed to get container status \"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e\": rpc error: code = NotFound desc = could not find container \"b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e\": container with ID starting with b17a2d8475d408ec069b6745314c189337c932f5b9c54dab64f6e2ff1a20191e not found: ID does not exist" Apr 16 16:22:44.529333 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.529297 2577 scope.go:117] "RemoveContainer" containerID="b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f" Apr 16 16:22:44.529537 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:22:44.529516 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f\": container with ID starting with b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f not found: ID does not exist" containerID="b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f" Apr 16 16:22:44.529603 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.529546 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f"} err="failed to get container status \"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f\": rpc error: code = NotFound desc = could not find container \"b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f\": container with ID starting with b1f70ff4938e7e4a7bd474710bdabe4cbec088ad698cedab7cc79050379fc54f not found: ID does not exist" Apr 16 16:22:44.530705 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.530685 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:44.534924 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.534903 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-j9z9n"] Apr 16 16:22:44.550558 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:22:44.550531 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31692745-5708-443d-b838-811e7a894168" path="/var/lib/kubelet/pods/31692745-5708-443d-b838-811e7a894168/volumes" Apr 16 16:23:12.007577 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.007539 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:23:12.008132 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.007846 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="main" containerID="cri-o://c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7" gracePeriod=30 Apr 16 16:23:12.253137 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.253111 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:23:12.280664 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.280587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.280664 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.280633 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.280864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.280705 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p65kx\" (UniqueName: \"kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.280864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.280772 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.280864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.280800 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.280864 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:23:12.280834 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location\") pod \"053aca76-2395-4ccb-b967-df89fb88ff5c\" (UID: \"053aca76-2395-4ccb-b967-df89fb88ff5c\") " Apr 16 16:23:12.282351 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.282049 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache" (OuterVolumeSpecName: "model-cache") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:12.284313 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.284277 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home" (OuterVolumeSpecName: "home") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:12.286437 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.286394 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx" (OuterVolumeSpecName: "kube-api-access-p65kx") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "kube-api-access-p65kx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:23:12.287248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.287218 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:23:12.287600 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.287573 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm" (OuterVolumeSpecName: "dshm") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:12.351521 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.351461 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "053aca76-2395-4ccb-b967-df89fb88ff5c" (UID: "053aca76-2395-4ccb-b967-df89fb88ff5c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:12.381869 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381839 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.381869 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381868 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.382026 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381879 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.382026 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381915 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/053aca76-2395-4ccb-b967-df89fb88ff5c-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.382026 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381929 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/053aca76-2395-4ccb-b967-df89fb88ff5c-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.382026 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.381941 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p65kx\" (UniqueName: \"kubernetes.io/projected/053aca76-2395-4ccb-b967-df89fb88ff5c-kube-api-access-p65kx\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:23:12.587059 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.587025 2577 generic.go:358] 
"Generic (PLEG): container finished" podID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerID="c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7" exitCode=0 Apr 16 16:23:12.587237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.587111 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" Apr 16 16:23:12.587237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.587115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerDied","Data":"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7"} Apr 16 16:23:12.587237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.587156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj" event={"ID":"053aca76-2395-4ccb-b967-df89fb88ff5c","Type":"ContainerDied","Data":"0aee834de0e79b1c26189b270a29863aac49c7edaffe63c4c6e53aa2220398f1"} Apr 16 16:23:12.587237 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.587176 2577 scope.go:117] "RemoveContainer" containerID="c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7" Apr 16 16:23:12.595348 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.595329 2577 scope.go:117] "RemoveContainer" containerID="1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822" Apr 16 16:23:12.615877 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.615853 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:23:12.625219 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.625199 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-nk5tj"] Apr 16 16:23:12.654797 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.654777 2577 scope.go:117] "RemoveContainer" containerID="c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7" Apr 16 16:23:12.655123 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:23:12.655098 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7\": container with ID starting with c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7 not found: ID does not exist" containerID="c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7" Apr 16 16:23:12.655171 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.655135 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7"} err="failed to get container status \"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7\": rpc error: code = NotFound desc = could not find container \"c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7\": container with ID starting with c2d3203b2f054107283fb88986f0b0dfd653da80c322423739efc0296691b5b7 not found: ID does not exist" Apr 16 16:23:12.655171 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.655156 2577 scope.go:117] "RemoveContainer" containerID="1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822" Apr 16 16:23:12.655459 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:23:12.655424 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822\": container with ID starting with 1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822 not found: ID does not exist" 
containerID="1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822" Apr 16 16:23:12.655526 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:12.655469 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822"} err="failed to get container status \"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822\": rpc error: code = NotFound desc = could not find container \"1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822\": container with ID starting with 1091b5481f08fbf3414a5f8070ae56dcd5d0b430e0b691dd376ded0e687c6822 not found: ID does not exist" Apr 16 16:23:14.551672 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:14.551643 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" path="/var/lib/kubelet/pods/053aca76-2395-4ccb-b967-df89fb88ff5c/volumes" Apr 16 16:23:29.861884 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.861847 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"] Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862143 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31692745-5708-443d-b838-811e7a894168" containerName="main" Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862154 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="31692745-5708-443d-b838-811e7a894168" containerName="main" Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862169 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31692745-5708-443d-b838-811e7a894168" containerName="storage-initializer" Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862174 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="31692745-5708-443d-b838-811e7a894168" containerName="storage-initializer"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862183 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="main"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862189 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="main"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862201 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="storage-initializer"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862206 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="storage-initializer"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862255 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="053aca76-2395-4ccb-b967-df89fb88ff5c" containerName="main"
Apr 16 16:23:29.862406 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.862263 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="31692745-5708-443d-b838-811e7a894168" containerName="main"
Apr 16 16:23:29.865420 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.865402 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.868297 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.868275 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:23:29.868410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.868275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 16:23:29.868410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.868276 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:23:29.868410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.868283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\""
Apr 16 16:23:29.875872 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.875849 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"]
Apr 16 16:23:29.924375 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.924555 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56l95\" (UniqueName: \"kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.924555 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.924555 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.924555 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:29.924555 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:29.924554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025306 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025306 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025562 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025619 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025724 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025780 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56l95\" (UniqueName: \"kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025780 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.025953 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.025932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.026020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.026002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.028337 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.028314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.028599 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.028584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.034567 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.034545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56l95\" (UniqueName: \"kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.177063 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.176990 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:23:30.301168 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.301141 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"]
Apr 16 16:23:30.303763 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:23:30.303734 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215eab93_9ac9_46f7_a01a_8d2d1e8aef2a.slice/crio-bc79e68160c26ae982064931bd5895f9dbfeb9010f36ec3770f8bb3315c39e35 WatchSource:0}: Error finding container bc79e68160c26ae982064931bd5895f9dbfeb9010f36ec3770f8bb3315c39e35: Status 404 returned error can't find the container with id bc79e68160c26ae982064931bd5895f9dbfeb9010f36ec3770f8bb3315c39e35
Apr 16 16:23:30.641553 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.641519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerStarted","Data":"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc"}
Apr 16 16:23:30.641553 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:30.641555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerStarted","Data":"bc79e68160c26ae982064931bd5895f9dbfeb9010f36ec3770f8bb3315c39e35"}
Apr 16 16:23:34.655413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:34.655379 2577 generic.go:358] "Generic (PLEG): container finished" podID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerID="9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc" exitCode=0
Apr 16 16:23:34.655793 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:23:34.655443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerDied","Data":"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc"}
Apr 16 16:24:19.802087 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:19.802046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerStarted","Data":"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e"}
Apr 16 16:24:19.823734 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:19.823686 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podStartSLOduration=5.891475688 podStartE2EDuration="50.823670089s" podCreationTimestamp="2026-04-16 16:23:29 +0000 UTC" firstStartedPulling="2026-04-16 16:23:34.656554897 +0000 UTC m=+1370.777212818" lastFinishedPulling="2026-04-16 16:24:19.588749297 +0000 UTC m=+1415.709407219" observedRunningTime="2026-04-16 16:24:19.821590241 +0000 UTC m=+1415.942248192" watchObservedRunningTime="2026-04-16 16:24:19.823670089 +0000 UTC m=+1415.944328029"
Apr 16 16:24:20.177363 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:20.177321 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:24:20.177363 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:20.177376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:24:20.178810 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:20.178776 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:24:30.178116 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:30.178071 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:24:40.178021 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:40.177981 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:24:50.178460 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:24:50.178415 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:00.178432 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:00.178383 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:10.177743 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:10.177699 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:20.178010 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:20.177958 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:30.178261 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:30.178219 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:40.177829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:40.177786 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 16:25:43.458392 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.458351 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"]
Apr 16 16:25:43.463355 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.463331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.465958 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.465940 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 16:25:43.473833 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.473799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"]
Apr 16 16:25:43.496106 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.496106 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496109 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.496321 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.496321 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.496321 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.496321 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.496251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4tkm\" (UniqueName: \"kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597185 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597386 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4tkm\" (UniqueName: \"kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597551 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597623 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597783 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597871 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.597928 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.597899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.599865 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.599839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.600257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.600234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.605637 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.605616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4tkm\" (UniqueName: \"kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm\") pod \"stop-feature-test-kserve-85568b7f4f-p8hc8\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.774312 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.774214 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:43.901487 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:43.901461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"]
Apr 16 16:25:43.903841 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:25:43.903814 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5811cfc4_595e_4b91_8542_6c6ffaa1198d.slice/crio-5aa2e467bcbec845966ffeb3cf15ce1cb29a0eca9b1b7b84582c9014f0916026 WatchSource:0}: Error finding container 5aa2e467bcbec845966ffeb3cf15ce1cb29a0eca9b1b7b84582c9014f0916026: Status 404 returned error can't find the container with id 5aa2e467bcbec845966ffeb3cf15ce1cb29a0eca9b1b7b84582c9014f0916026
Apr 16 16:25:44.059682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:44.059578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerStarted","Data":"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a"}
Apr 16 16:25:44.059682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:44.059625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerStarted","Data":"5aa2e467bcbec845966ffeb3cf15ce1cb29a0eca9b1b7b84582c9014f0916026"}
Apr 16 16:25:44.558245 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:44.558215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:25:44.561028 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:44.561007 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:25:49.076110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:49.076072 2577 generic.go:358] "Generic (PLEG): container finished" podID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerID="8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a" exitCode=0
Apr 16 16:25:49.076564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:49.076148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerDied","Data":"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a"}
Apr 16 16:25:50.080356 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:50.080320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerStarted","Data":"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112"}
Apr 16 16:25:50.103299 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:50.103244 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podStartSLOduration=7.103226289 podStartE2EDuration="7.103226289s" podCreationTimestamp="2026-04-16 16:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:50.099929541 +0000 UTC m=+1506.220587481" watchObservedRunningTime="2026-04-16 16:25:50.103226289 +0000 UTC m=+1506.223884229"
Apr 16 16:25:50.187672 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:50.187644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:25:50.195758 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:50.195733 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:25:53.774597 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:53.774545 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:53.775115 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:53.774632 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"
Apr 16 16:25:53.776117 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:53.776085 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 16:25:57.132422 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:57.132382 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"]
Apr 16 16:25:57.132889 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:25:57.132692 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" containerID="cri-o://a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e" gracePeriod=30
Apr 16 16:26:03.774735 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:03.774629 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 16:26:13.775083 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:13.775043 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 16:26:23.774942 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:23.774900 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 16:26:27.385840 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.385813 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6_215eab93-9ac9-46f7-a01a-8d2d1e8aef2a/main/0.log"
Apr 16 16:26:27.386202 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.386186 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"
Apr 16 16:26:27.391176 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391125 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391249 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391171 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391249 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391223 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56l95\" (UniqueName: \"kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391338 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391255 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391338 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391316 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391437 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391371 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs\") pod \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\" (UID: \"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a\") "
Apr 16 16:26:27.391809 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.391775 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home" (OuterVolumeSpecName: "home") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:26:27.392084 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.392048 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache" (OuterVolumeSpecName: "model-cache") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:26:27.393529 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.393493 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm" (OuterVolumeSpecName: "dshm") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:26:27.393774 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.393750 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:26:27.394148 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.394123 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95" (OuterVolumeSpecName: "kube-api-access-56l95") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "kube-api-access-56l95". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:26:27.452039 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.451989 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" (UID: "215eab93-9ac9-46f7-a01a-8d2d1e8aef2a"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:27.492051 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492020 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:27.492051 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492047 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:27.492205 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492057 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:27.492205 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492067 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-56l95\" (UniqueName: \"kubernetes.io/projected/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kube-api-access-56l95\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:27.492205 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492078 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:27.492205 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:27.492086 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:26:28.201225 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201193 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6_215eab93-9ac9-46f7-a01a-8d2d1e8aef2a/main/0.log" Apr 16 16:26:28.201539 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201511 2577 generic.go:358] "Generic (PLEG): container finished" podID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerID="a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e" exitCode=137 Apr 16 16:26:28.201633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerDied","Data":"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e"} Apr 16 16:26:28.201633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" event={"ID":"215eab93-9ac9-46f7-a01a-8d2d1e8aef2a","Type":"ContainerDied","Data":"bc79e68160c26ae982064931bd5895f9dbfeb9010f36ec3770f8bb3315c39e35"} Apr 16 16:26:28.201633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201606 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6" Apr 16 16:26:28.201633 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.201618 2577 scope.go:117] "RemoveContainer" containerID="a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e" Apr 16 16:26:28.220672 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.220649 2577 scope.go:117] "RemoveContainer" containerID="9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc" Apr 16 16:26:28.225177 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.225153 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"] Apr 16 16:26:28.230981 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.230962 2577 scope.go:117] "RemoveContainer" containerID="a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e" Apr 16 16:26:28.231315 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:26:28.231292 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e\": container with ID starting with a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e not found: ID does not exist" containerID="a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e" Apr 16 16:26:28.232098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.231331 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e"} err="failed to get container status \"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e\": rpc error: code = NotFound desc = could not find container \"a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e\": container with ID starting with a964a7d628623be100ebe45c1f7e58c32a696ea8b3b6b4ce4f5f4ad46b6c665e not 
found: ID does not exist" Apr 16 16:26:28.232098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.231361 2577 scope.go:117] "RemoveContainer" containerID="9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc" Apr 16 16:26:28.232098 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:26:28.231726 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc\": container with ID starting with 9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc not found: ID does not exist" containerID="9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc" Apr 16 16:26:28.232098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.231754 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc"} err="failed to get container status \"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc\": rpc error: code = NotFound desc = could not find container \"9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc\": container with ID starting with 9c38d9a9d8e5b69825879c9cd912d18a6d0c8946bdef1066185c052ae05aaddc not found: ID does not exist" Apr 16 16:26:28.234460 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.234421 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-cscb6"] Apr 16 16:26:28.551819 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:28.551729 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" path="/var/lib/kubelet/pods/215eab93-9ac9-46f7-a01a-8d2d1e8aef2a/volumes" Apr 16 16:26:33.775145 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:33.775107 2577 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 16:26:43.774883 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:43.774842 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 16:26:53.775111 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:26:53.775066 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 16:27:03.775151 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:03.775102 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 16:27:13.775273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:13.775232 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 16:27:23.784241 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:23.784211 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" Apr 16 16:27:23.791946 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:23.791920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" Apr 16 16:27:24.919774 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:24.919740 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"] Apr 16 16:27:25.373438 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:25.373400 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" containerID="cri-o://9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112" gracePeriod=30 Apr 16 16:27:55.631259 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.631192 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-p8hc8_5811cfc4-595e-4b91-8542-6c6ffaa1198d/main/0.log" Apr 16 16:27:55.631615 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.631572 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" Apr 16 16:27:55.713184 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713152 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713203 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713260 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713303 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4tkm\" (UniqueName: \"kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713346 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713330 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713578 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:27:55.713363 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home\") pod \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\" (UID: \"5811cfc4-595e-4b91-8542-6c6ffaa1198d\") " Apr 16 16:27:55.713642 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713614 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache" (OuterVolumeSpecName: "model-cache") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:55.713822 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.713791 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home" (OuterVolumeSpecName: "home") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:55.715730 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.715704 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:27:55.715730 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.715714 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm" (OuterVolumeSpecName: "kube-api-access-n4tkm") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "kube-api-access-n4tkm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:27:55.715859 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.715794 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm" (OuterVolumeSpecName: "dshm") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:55.768803 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.768775 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5811cfc4-595e-4b91-8542-6c6ffaa1198d" (UID: "5811cfc4-595e-4b91-8542-6c6ffaa1198d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:55.814675 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814651 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4tkm\" (UniqueName: \"kubernetes.io/projected/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kube-api-access-n4tkm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:55.814675 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814675 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:55.814795 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814686 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:55.814795 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814695 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:55.814795 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814704 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5811cfc4-595e-4b91-8542-6c6ffaa1198d-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:55.814795 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:55.814713 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5811cfc4-595e-4b91-8542-6c6ffaa1198d-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:27:56.471878 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.471850 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-p8hc8_5811cfc4-595e-4b91-8542-6c6ffaa1198d/main/0.log" Apr 16 16:27:56.472190 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.472165 2577 generic.go:358] "Generic (PLEG): container finished" podID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerID="9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112" exitCode=137 Apr 16 16:27:56.472256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.472209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerDied","Data":"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112"} Apr 16 16:27:56.472256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.472243 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" Apr 16 16:27:56.472327 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.472255 2577 scope.go:117] "RemoveContainer" containerID="9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112" Apr 16 16:27:56.472379 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.472243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8" event={"ID":"5811cfc4-595e-4b91-8542-6c6ffaa1198d","Type":"ContainerDied","Data":"5aa2e467bcbec845966ffeb3cf15ce1cb29a0eca9b1b7b84582c9014f0916026"} Apr 16 16:27:56.491469 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.491428 2577 scope.go:117] "RemoveContainer" containerID="8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a" Apr 16 16:27:56.495357 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.495333 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"] Apr 16 16:27:56.500942 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:27:56.500918 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-p8hc8"] Apr 16 16:27:56.502397 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.502385 2577 scope.go:117] "RemoveContainer" containerID="9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112" Apr 16 16:27:56.502650 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:27:56.502631 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112\": container with ID starting with 9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112 not found: ID does not exist" containerID="9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112" Apr 16 16:27:56.502707 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.502657 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112"} err="failed to get container status \"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112\": rpc error: code = NotFound desc = could not find container \"9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112\": container with ID starting with 9eac7e0bc39188f57af9492c002b2b5de6e065f5e7bd2fa168893537f7d83112 not found: ID does not exist" Apr 16 16:27:56.502707 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.502673 2577 scope.go:117] "RemoveContainer" containerID="8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a" Apr 16 16:27:56.502915 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:27:56.502898 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a\": container with ID starting with 
8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a not found: ID does not exist" containerID="8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a" Apr 16 16:27:56.502960 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.502921 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a"} err="failed to get container status \"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a\": rpc error: code = NotFound desc = could not find container \"8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a\": container with ID starting with 8bfd0411e97452242094e240467ba11f0453f2531d4538afa5feef4dde66127a not found: ID does not exist" Apr 16 16:27:56.549701 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:27:56.549672 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" path="/var/lib/kubelet/pods/5811cfc4-595e-4b91-8542-6c6ffaa1198d/volumes" Apr 16 16:28:39.262065 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262031 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262438 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262474 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262498 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="storage-initializer" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:28:39.262507 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="storage-initializer" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262519 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="storage-initializer" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262528 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="storage-initializer" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262536 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" Apr 16 16:28:39.262564 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262541 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" Apr 16 16:28:39.262856 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262609 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="215eab93-9ac9-46f7-a01a-8d2d1e8aef2a" containerName="main" Apr 16 16:28:39.262856 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.262621 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5811cfc4-595e-4b91-8542-6c6ffaa1198d" containerName="main" Apr 16 16:28:39.265952 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.265936 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.268704 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.268678 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:28:39.268950 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.268933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:28:39.269065 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.269049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 16:28:39.269953 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.269936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:28:39.276725 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.276702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:28:39.359744 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.359744 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwnq\" (UniqueName: \"kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: 
\"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.359918 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.359918 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.359918 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.359918 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.359889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.460845 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:28:39.460810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.460845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.460848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwnq\" (UniqueName: \"kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461073 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.460982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461073 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home\") pod 
\"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461416 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461524 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.461524 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.461515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.463278 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:28:39.463260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.463436 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.463419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.472427 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.472407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwnq\" (UniqueName: \"kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq\") pod \"router-with-refs-test-kserve-9bbdcd449-6rfdm\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.577894 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.577806 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:39.706469 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.706427 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:28:39.708782 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:28:39.708755 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5367a5_a14d_423d_8504_c36f1efc187e.slice/crio-cfacdf3f84a6685a7d40f8cfc4cc084626bba2b5bda7df256d9955c27803a714 WatchSource:0}: Error finding container cfacdf3f84a6685a7d40f8cfc4cc084626bba2b5bda7df256d9955c27803a714: Status 404 returned error can't find the container with id cfacdf3f84a6685a7d40f8cfc4cc084626bba2b5bda7df256d9955c27803a714 Apr 16 16:28:39.710491 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:39.710437 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:28:40.598845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:40.598810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerStarted","Data":"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee"} Apr 16 16:28:40.598845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:40.598845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerStarted","Data":"cfacdf3f84a6685a7d40f8cfc4cc084626bba2b5bda7df256d9955c27803a714"} Apr 16 16:28:44.611299 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:44.611269 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c5367a5-a14d-423d-8504-c36f1efc187e" 
containerID="b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee" exitCode=0 Apr 16 16:28:44.611673 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:44.611304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerDied","Data":"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee"} Apr 16 16:28:45.615657 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:45.615623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerStarted","Data":"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1"} Apr 16 16:28:45.638608 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:45.638553 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podStartSLOduration=6.638537008 podStartE2EDuration="6.638537008s" podCreationTimestamp="2026-04-16 16:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:28:45.637047089 +0000 UTC m=+1681.757705041" watchObservedRunningTime="2026-04-16 16:28:45.638537008 +0000 UTC m=+1681.759194946" Apr 16 16:28:49.578300 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:49.578259 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:49.578300 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:49.578304 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:28:49.579831 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:49.579798 2577 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:28:59.578880 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:28:59.578840 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:00.579819 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.579782 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"] Apr 16 16:29:00.607660 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.607538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"] Apr 16 16:29:00.607800 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.607692 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.610869 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.610847 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-vssgs\"" Apr 16 16:29:00.611031 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.611010 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 16:29:00.727042 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.726999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.727248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.727051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttkj\" (UniqueName: \"kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.727248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.727129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 
16:29:00.727248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.727176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.727248 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.727221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.727408 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.727256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828125 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttkj\" (UniqueName: \"kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828125 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828132 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm\") pod 
\"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828676 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828789 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.828789 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.828731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.830597 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.830553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.830819 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:29:00.830802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.835784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.835763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttkj\" (UniqueName: \"kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-dgnc9\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:00.925426 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:00.925388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:01.053577 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:01.053384 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"] Apr 16 16:29:01.056416 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:29:01.056383 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a0abf8_0351_49e3_b555_f123355bc770.slice/crio-ddff44c0d649751f7fbea2f4df8bc6ab8b6e8a142b57f9e3f2c5b61204448fcc WatchSource:0}: Error finding container ddff44c0d649751f7fbea2f4df8bc6ab8b6e8a142b57f9e3f2c5b61204448fcc: Status 404 returned error can't find the container with id ddff44c0d649751f7fbea2f4df8bc6ab8b6e8a142b57f9e3f2c5b61204448fcc Apr 16 16:29:01.664848 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:01.664634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerStarted","Data":"ddff44c0d649751f7fbea2f4df8bc6ab8b6e8a142b57f9e3f2c5b61204448fcc"} Apr 16 16:29:02.670401 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:02.670368 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerStarted","Data":"e05d2ee1280370ffda2ae9fdf8353f79174af78ae1f5b3acf7435c26c9950705"} Apr 16 16:29:02.670788 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:02.670540 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:03.675684 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:03.675645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerStarted","Data":"eafa8c38632b0cc13a8b10027a1c34125ebd8dd9902fa59d48b98cdd8aa1e46b"} Apr 16 16:29:07.689271 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:07.689236 2577 generic.go:358] "Generic (PLEG): container finished" podID="84a0abf8-0351-49e3-b555-f123355bc770" containerID="eafa8c38632b0cc13a8b10027a1c34125ebd8dd9902fa59d48b98cdd8aa1e46b" exitCode=0 Apr 16 16:29:07.689698 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:07.689309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerDied","Data":"eafa8c38632b0cc13a8b10027a1c34125ebd8dd9902fa59d48b98cdd8aa1e46b"} Apr 16 16:29:08.694767 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:08.694723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerStarted","Data":"c4a5ae6cfbe68dcbd00d6bdef82a2cf6725e0ac391e03b6d02a6a0b52e56653e"} Apr 16 16:29:08.718720 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:08.718669 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podStartSLOduration=7.827741272 podStartE2EDuration="8.71865159s" podCreationTimestamp="2026-04-16 16:29:00 +0000 UTC" firstStartedPulling="2026-04-16 16:29:01.058504651 +0000 UTC m=+1697.179162571" lastFinishedPulling="2026-04-16 16:29:01.949414972 +0000 UTC m=+1698.070072889" observedRunningTime="2026-04-16 16:29:08.716778668 +0000 UTC m=+1704.837436631" watchObservedRunningTime="2026-04-16 16:29:08.71865159 +0000 UTC m=+1704.839309531" Apr 16 16:29:09.578843 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:09.578799 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:10.925962 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:10.925921 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:10.925962 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:10.925978 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:10.927484 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:10.927429 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" 
podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:29:19.579158 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:19.579111 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:20.926058 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:20.926010 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:29:20.944838 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:20.944808 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:29:29.578403 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:29.578358 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:30.926067 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:30.926026 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 
10.132.0.32:8001: connect: connection refused" Apr 16 16:29:39.578872 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:39.578829 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:40.926531 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:40.926470 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:29:49.579157 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:49.579106 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:29:50.926546 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:50.926487 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:29:59.578527 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:29:59.578483 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": 
dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:30:00.926572 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:00.926522 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:30:09.578286 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:09.578248 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:30:10.926613 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:10.926567 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:30:19.579025 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:19.578976 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:30:20.926411 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:20.926362 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:30:29.579117 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:29.579076 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 16:30:30.926343 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:30.926243 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:30:39.588786 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:39.588746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:30:39.597488 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:39.597461 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:30:40.926733 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:40.926682 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:30:44.581804 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:44.581770 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" 
Apr 16 16:30:44.585556 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:44.585534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log" Apr 16 16:30:48.366152 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:48.366122 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:30:48.366582 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:48.366400 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main" containerID="cri-o://ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1" gracePeriod=30 Apr 16 16:30:50.925930 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:30:50.925877 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8001/health\": dial tcp 10.132.0.32:8001: connect: connection refused" Apr 16 16:31:00.941534 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:00.941504 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:31:00.953377 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:00.953355 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" Apr 16 16:31:12.387876 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.387845 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"] Apr 16 16:31:12.391377 ip-10-0-140-78 kubenswrapper[2577]: 
I0416 16:31:12.391352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.394556 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.394531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 16:31:12.409411 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.409383 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"] Apr 16 16:31:12.578153 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.578319 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.578319 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: 
\"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.578319 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.578426 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.578426 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.578385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45zld\" (UniqueName: \"kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.679761 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.679761 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.679761 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680005 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680005 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: 
\"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680005 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.679890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45zld\" (UniqueName: \"kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680161 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.680073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680239 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.680208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.680298 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.680248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.682208 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.682182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.682384 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.682368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.695712 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.695687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45zld\" (UniqueName: \"kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.702476 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.702442 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:12.828113 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:12.828087 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"] Apr 16 16:31:12.830655 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:31:12.830627 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52f1951_9d17_45a7_8075_e427281ab2c3.slice/crio-7402263911b0eab2f381b2f0511eed7763537cd968a6549d87dcf19849c32d68 WatchSource:0}: Error finding container 7402263911b0eab2f381b2f0511eed7763537cd968a6549d87dcf19849c32d68: Status 404 returned error can't find the container with id 7402263911b0eab2f381b2f0511eed7763537cd968a6549d87dcf19849c32d68 Apr 16 16:31:13.083299 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:13.083208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerStarted","Data":"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914"} Apr 16 16:31:13.083299 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:13.083255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerStarted","Data":"7402263911b0eab2f381b2f0511eed7763537cd968a6549d87dcf19849c32d68"} Apr 16 16:31:17.098160 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:17.098074 2577 generic.go:358] "Generic (PLEG): container finished" podID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerID="4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914" exitCode=0 Apr 16 16:31:17.098160 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:31:17.098148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerDied","Data":"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914"} Apr 16 16:31:18.104177 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.104144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerStarted","Data":"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c"} Apr 16 16:31:18.125518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.125477 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podStartSLOduration=6.125437185 podStartE2EDuration="6.125437185s" podCreationTimestamp="2026-04-16 16:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:31:18.123173177 +0000 UTC m=+1834.243831129" watchObservedRunningTime="2026-04-16 16:31:18.125437185 +0000 UTC m=+1834.246095121" Apr 16 16:31:18.645872 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.645838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-9bbdcd449-6rfdm_9c5367a5-a14d-423d-8504-c36f1efc187e/main/0.log" Apr 16 16:31:18.646250 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.646231 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:31:18.727923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.727895 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.727923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.727930 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwnq\" (UniqueName: \"kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.728172 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.727948 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.728172 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.727972 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.728172 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.728041 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.728172 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:31:18.728079 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm\") pod \"9c5367a5-a14d-423d-8504-c36f1efc187e\" (UID: \"9c5367a5-a14d-423d-8504-c36f1efc187e\") " Apr 16 16:31:18.728385 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.728330 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home" (OuterVolumeSpecName: "home") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:18.728558 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.728514 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache" (OuterVolumeSpecName: "model-cache") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:18.730298 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.730266 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq" (OuterVolumeSpecName: "kube-api-access-8hwnq") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "kube-api-access-8hwnq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:18.730298 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.730286 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:18.730423 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.730343 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm" (OuterVolumeSpecName: "dshm") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:18.795443 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.795380 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9c5367a5-a14d-423d-8504-c36f1efc187e" (UID: "9c5367a5-a14d-423d-8504-c36f1efc187e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:18.828809 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828784 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:18.828809 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828808 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:18.828967 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828816 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:18.828967 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828825 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5367a5-a14d-423d-8504-c36f1efc187e-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:18.828967 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828835 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8hwnq\" (UniqueName: \"kubernetes.io/projected/9c5367a5-a14d-423d-8504-c36f1efc187e-kube-api-access-8hwnq\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:18.828967 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:18.828842 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9c5367a5-a14d-423d-8504-c36f1efc187e-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:31:19.108422 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108331 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-9bbdcd449-6rfdm_9c5367a5-a14d-423d-8504-c36f1efc187e/main/0.log" Apr 16 16:31:19.108862 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108720 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerID="ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1" exitCode=137 Apr 16 16:31:19.108862 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108803 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" Apr 16 16:31:19.108862 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerDied","Data":"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1"} Apr 16 16:31:19.108862 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm" event={"ID":"9c5367a5-a14d-423d-8504-c36f1efc187e","Type":"ContainerDied","Data":"cfacdf3f84a6685a7d40f8cfc4cc084626bba2b5bda7df256d9955c27803a714"} Apr 16 16:31:19.109017 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.108869 2577 scope.go:117] "RemoveContainer" containerID="ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1" Apr 16 16:31:19.127546 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.127523 2577 scope.go:117] "RemoveContainer" containerID="b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee" Apr 16 16:31:19.138052 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.138032 2577 scope.go:117] "RemoveContainer" containerID="ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1" Apr 16 16:31:19.138750 ip-10-0-140-78 
kubenswrapper[2577]: E0416 16:31:19.138705 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1\": container with ID starting with ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1 not found: ID does not exist" containerID="ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1" Apr 16 16:31:19.138859 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.138746 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1"} err="failed to get container status \"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1\": rpc error: code = NotFound desc = could not find container \"ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1\": container with ID starting with ed8f8390f1c125d10806be458122e115d82301906e40c3d2c550f2f3397cbbc1 not found: ID does not exist" Apr 16 16:31:19.138859 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.138771 2577 scope.go:117] "RemoveContainer" containerID="b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee" Apr 16 16:31:19.139086 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:31:19.139033 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee\": container with ID starting with b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee not found: ID does not exist" containerID="b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee" Apr 16 16:31:19.139086 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.139069 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee"} err="failed to 
get container status \"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee\": rpc error: code = NotFound desc = could not find container \"b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee\": container with ID starting with b900df2ce5309f184463a3a137c86e1c35fb378388d5d22e3829721f0005efee not found: ID does not exist" Apr 16 16:31:19.139279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.139264 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:31:19.142407 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:19.142383 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-6rfdm"] Apr 16 16:31:20.550064 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:20.550017 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" path="/var/lib/kubelet/pods/9c5367a5-a14d-423d-8504-c36f1efc187e/volumes" Apr 16 16:31:22.703603 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:22.703570 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:22.703946 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:22.703616 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:31:22.705008 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:22.704976 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 16:31:32.703392 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:31:32.703342 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:31:33.154792 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:33.154759 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"]
Apr 16 16:31:33.155194 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:33.155140 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main" containerID="cri-o://c4a5ae6cfbe68dcbd00d6bdef82a2cf6725e0ac391e03b6d02a6a0b52e56653e" gracePeriod=30
Apr 16 16:31:40.339104 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339051 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"]
Apr 16 16:31:40.339609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339539 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="storage-initializer"
Apr 16 16:31:40.339609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339556 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="storage-initializer"
Apr 16 16:31:40.339609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339586 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main"
Apr 16 16:31:40.339609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339595 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main"
Apr 16 16:31:40.339778 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.339677 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5367a5-a14d-423d-8504-c36f1efc187e" containerName="main"
Apr 16 16:31:40.342944 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.342925 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.345844 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.345823 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 16 16:31:40.352221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.352194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"]
Apr 16 16:31:40.407212 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.407212 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.407472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.407472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.407472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52vhs\" (UniqueName: \"kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.407472 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.407396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.508732 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.508695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.508894 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.508741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.508894 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.508794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52vhs\" (UniqueName: \"kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.508894 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.508828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.508894 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.508864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.509092 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.509004 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.509151 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.509099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.509212 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.509157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.509212 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.509198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.511383 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.511353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.511722 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.511703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.517199 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.517176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52vhs\" (UniqueName: \"kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.655017 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.654981 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:40.853028 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:31:40.852883 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2c1cd9_97f7_4b7c_bf38_787c24e91147.slice/crio-d6687806a05c029468e9857b4dc8ad4ce6136f085013c7d1a5d9a8d10b6cb382 WatchSource:0}: Error finding container d6687806a05c029468e9857b4dc8ad4ce6136f085013c7d1a5d9a8d10b6cb382: Status 404 returned error can't find the container with id d6687806a05c029468e9857b4dc8ad4ce6136f085013c7d1a5d9a8d10b6cb382
Apr 16 16:31:40.857840 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:40.857814 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"]
Apr 16 16:31:41.202923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:41.202834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerStarted","Data":"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"}
Apr 16 16:31:41.202923 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:41.202883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerStarted","Data":"d6687806a05c029468e9857b4dc8ad4ce6136f085013c7d1a5d9a8d10b6cb382"}
Apr 16 16:31:42.703078 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:42.703029 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:31:45.217706 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:45.217668 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerID="85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb" exitCode=0
Apr 16 16:31:45.218101 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:45.217714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerDied","Data":"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"}
Apr 16 16:31:46.226952 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:46.226915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerStarted","Data":"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"}
Apr 16 16:31:46.249333 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:46.249289 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podStartSLOduration=6.249272905 podStartE2EDuration="6.249272905s" podCreationTimestamp="2026-04-16 16:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:31:46.247081394 +0000 UTC m=+1862.367739333" watchObservedRunningTime="2026-04-16 16:31:46.249272905 +0000 UTC m=+1862.369930846"
Apr 16 16:31:50.655303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:50.655244 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:50.655761 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:50.655399 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:31:50.657154 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:50.657121 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:31:52.703752 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:31:52.703709 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:00.656630 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:00.656529 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:02.703280 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:02.703235 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:03.155991 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.155934 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="llm-d-routing-sidecar" containerID="cri-o://e05d2ee1280370ffda2ae9fdf8353f79174af78ae1f5b3acf7435c26c9950705" gracePeriod=2
Apr 16 16:32:03.284430 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.284399 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-dgnc9_84a0abf8-0351-49e3-b555-f123355bc770/main/0.log"
Apr 16 16:32:03.285240 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.285213 2577 generic.go:358] "Generic (PLEG): container finished" podID="84a0abf8-0351-49e3-b555-f123355bc770" containerID="c4a5ae6cfbe68dcbd00d6bdef82a2cf6725e0ac391e03b6d02a6a0b52e56653e" exitCode=137
Apr 16 16:32:03.285240 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.285240 2577 generic.go:358] "Generic (PLEG): container finished" podID="84a0abf8-0351-49e3-b555-f123355bc770" containerID="e05d2ee1280370ffda2ae9fdf8353f79174af78ae1f5b3acf7435c26c9950705" exitCode=0
Apr 16 16:32:03.285372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.285279 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerDied","Data":"c4a5ae6cfbe68dcbd00d6bdef82a2cf6725e0ac391e03b6d02a6a0b52e56653e"}
Apr 16 16:32:03.285372 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.285323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerDied","Data":"e05d2ee1280370ffda2ae9fdf8353f79174af78ae1f5b3acf7435c26c9950705"}
Apr 16 16:32:03.422519 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.422438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-dgnc9_84a0abf8-0351-49e3-b555-f123355bc770/main/0.log"
Apr 16 16:32:03.423320 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.423294 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"
Apr 16 16:32:03.511348 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511313 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511365 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511397 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ttkj\" (UniqueName: \"kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511475 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511721 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511721 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511677 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location\") pod \"84a0abf8-0351-49e3-b555-f123355bc770\" (UID: \"84a0abf8-0351-49e3-b555-f123355bc770\") "
Apr 16 16:32:03.511721 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511693 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache" (OuterVolumeSpecName: "model-cache") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:03.511874 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.511821 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home" (OuterVolumeSpecName: "home") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:03.512134 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.512076 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:03.512134 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.512105 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:03.513946 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.513907 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj" (OuterVolumeSpecName: "kube-api-access-6ttkj") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "kube-api-access-6ttkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:32:03.514235 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.514212 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm" (OuterVolumeSpecName: "dshm") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:03.514304 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.514271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:32:03.566165 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.566120 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84a0abf8-0351-49e3-b555-f123355bc770" (UID: "84a0abf8-0351-49e3-b555-f123355bc770"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:03.612817 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.612783 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a0abf8-0351-49e3-b555-f123355bc770-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:03.612963 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.612823 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:03.612963 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.612841 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a0abf8-0351-49e3-b555-f123355bc770-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:03.612963 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:03.612857 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ttkj\" (UniqueName: \"kubernetes.io/projected/84a0abf8-0351-49e3-b555-f123355bc770-kube-api-access-6ttkj\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:32:04.290077 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.290046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-dgnc9_84a0abf8-0351-49e3-b555-f123355bc770/main/0.log"
Apr 16 16:32:04.290855 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.290826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9" event={"ID":"84a0abf8-0351-49e3-b555-f123355bc770","Type":"ContainerDied","Data":"ddff44c0d649751f7fbea2f4df8bc6ab8b6e8a142b57f9e3f2c5b61204448fcc"}
Apr 16 16:32:04.290941 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.290867 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"
Apr 16 16:32:04.290941 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.290878 2577 scope.go:117] "RemoveContainer" containerID="c4a5ae6cfbe68dcbd00d6bdef82a2cf6725e0ac391e03b6d02a6a0b52e56653e"
Apr 16 16:32:04.318347 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.318314 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"]
Apr 16 16:32:04.319153 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.319125 2577 scope.go:117] "RemoveContainer" containerID="eafa8c38632b0cc13a8b10027a1c34125ebd8dd9902fa59d48b98cdd8aa1e46b"
Apr 16 16:32:04.321655 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.321616 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-dgnc9"]
Apr 16 16:32:04.382694 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.382663 2577 scope.go:117] "RemoveContainer" containerID="e05d2ee1280370ffda2ae9fdf8353f79174af78ae1f5b3acf7435c26c9950705"
Apr 16 16:32:04.551392 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:04.551303 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a0abf8-0351-49e3-b555-f123355bc770" path="/var/lib/kubelet/pods/84a0abf8-0351-49e3-b555-f123355bc770/volumes"
Apr 16 16:32:10.656004 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:10.655954 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:12.703057 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:12.703004 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:20.656420 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:20.656374 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:22.703635 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:22.703582 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:30.655826 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:30.655788 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:32.703961 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:32.703915 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:40.655646 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:40.655607 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:42.703760 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:42.703710 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:32:50.655727 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:50.655676 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:32:52.702925 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:32:52.702886 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 16:33:00.656318 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:00.656276 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:33:02.713490 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:02.713430 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"
Apr 16 16:33:02.721412 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:02.721386 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"
Apr 16 16:33:10.656113 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:10.656065 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:33:15.686149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:15.686105 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"]
Apr 16 16:33:15.686569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:15.686488 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" containerID="cri-o://f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c" gracePeriod=30
Apr 16 16:33:20.656243 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:20.656201 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 16:33:21.624699 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.624660 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"]
Apr 16 16:33:21.625022 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625001 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="llm-d-routing-sidecar"
Apr 16 16:33:21.625022 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625020 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="llm-d-routing-sidecar"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625035 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="storage-initializer"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625043 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="storage-initializer"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625062 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625070 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625165 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="main"
Apr 16 16:33:21.625200 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.625179 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="84a0abf8-0351-49e3-b555-f123355bc770" containerName="llm-d-routing-sidecar"
Apr 16 16:33:21.631099 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.631077 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.633928 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.633903 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 16 16:33:21.642527 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.642498 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"]
Apr 16 16:33:21.691202 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.691202 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.691784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.691784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzjf\" (UniqueName: \"kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.691784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691331 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:33:21.691784 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.691378 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792775 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.792986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.792926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzjf\" (UniqueName: \"kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.793297 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.793271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.793374 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.793349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.793433 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.793373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.795396 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.795376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.795805 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.795781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.800794 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.800772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzjf\" (UniqueName: \"kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:21.943046 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:21.942955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:22.083513 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:22.083481 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"] Apr 16 16:33:22.085560 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:33:22.085526 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5920342b_5ad0_4a17_9a56_775cadc3c716.slice/crio-78274221da8994ea4048f9ceee259c14effbe5f90feb23c33b332ac4264ce0e4 WatchSource:0}: Error finding container 78274221da8994ea4048f9ceee259c14effbe5f90feb23c33b332ac4264ce0e4: Status 404 returned error can't find the container with id 78274221da8994ea4048f9ceee259c14effbe5f90feb23c33b332ac4264ce0e4 Apr 16 16:33:22.565618 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:22.565588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerStarted","Data":"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f"} Apr 16 16:33:22.565844 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:22.565627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerStarted","Data":"78274221da8994ea4048f9ceee259c14effbe5f90feb23c33b332ac4264ce0e4"} Apr 16 16:33:26.582580 
ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:26.582499 2577 generic.go:358] "Generic (PLEG): container finished" podID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerID="7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f" exitCode=0 Apr 16 16:33:26.582580 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:26.582549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerDied","Data":"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f"} Apr 16 16:33:27.587677 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:27.587634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerStarted","Data":"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"} Apr 16 16:33:27.609273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:27.609211 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podStartSLOduration=6.609193068 podStartE2EDuration="6.609193068s" podCreationTimestamp="2026-04-16 16:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:33:27.607197072 +0000 UTC m=+1963.727855017" watchObservedRunningTime="2026-04-16 16:33:27.609193068 +0000 UTC m=+1963.729851014" Apr 16 16:33:30.665813 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:30.665725 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" Apr 16 16:33:30.673801 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:30.673771 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" Apr 16 16:33:31.943710 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:31.943672 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:31.943710 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:31.943717 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" Apr 16 16:33:31.944909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:31.944884 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 16:33:41.926413 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:41.926377 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"] Apr 16 16:33:41.926987 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:41.926760 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" containerID="cri-o://0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311" gracePeriod=30 Apr 16 16:33:41.944178 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:41.944145 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 16:33:45.947402 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:45.947338 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:33:46.008029 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.007999 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008029 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008039 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008078 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008106 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008161 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008287 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008184 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45zld\" (UniqueName: \"kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld\") pod \"e52f1951-9d17-45a7-8075-e427281ab2c3\" (UID: \"e52f1951-9d17-45a7-8075-e427281ab2c3\") " Apr 16 16:33:46.008521 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008343 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache" (OuterVolumeSpecName: "model-cache") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:33:46.008521 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.008503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home" (OuterVolumeSpecName: "home") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:33:46.010438 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.010405 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:33:46.010581 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.010441 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm" (OuterVolumeSpecName: "dshm") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:33:46.010581 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.010487 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld" (OuterVolumeSpecName: "kube-api-access-45zld") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "kube-api-access-45zld". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:33:46.075547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.075509 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e52f1951-9d17-45a7-8075-e427281ab2c3" (UID: "e52f1951-9d17-45a7-8075-e427281ab2c3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:33:46.108770 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108744 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.108770 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108770 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-45zld\" (UniqueName: \"kubernetes.io/projected/e52f1951-9d17-45a7-8075-e427281ab2c3-kube-api-access-45zld\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.108956 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108780 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f1951-9d17-45a7-8075-e427281ab2c3-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.108956 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108791 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.108956 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108800 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.108956 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.108809 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52f1951-9d17-45a7-8075-e427281ab2c3-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:33:46.654098 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.654054 2577 generic.go:358] 
"Generic (PLEG): container finished" podID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerID="f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c" exitCode=137 Apr 16 16:33:46.654273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.654136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerDied","Data":"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c"} Apr 16 16:33:46.654273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.654183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" event={"ID":"e52f1951-9d17-45a7-8075-e427281ab2c3","Type":"ContainerDied","Data":"7402263911b0eab2f381b2f0511eed7763537cd968a6549d87dcf19849c32d68"} Apr 16 16:33:46.654273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.654199 2577 scope.go:117] "RemoveContainer" containerID="f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c" Apr 16 16:33:46.654273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.654143 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4" Apr 16 16:33:46.674572 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.674548 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"] Apr 16 16:33:46.680358 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.680327 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6flr8m4"] Apr 16 16:33:46.682433 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.682412 2577 scope.go:117] "RemoveContainer" containerID="4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914" Apr 16 16:33:46.753173 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.753001 2577 scope.go:117] "RemoveContainer" containerID="f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c" Apr 16 16:33:46.753385 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:33:46.753361 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c\": container with ID starting with f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c not found: ID does not exist" containerID="f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c" Apr 16 16:33:46.753443 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.753396 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c"} err="failed to get container status \"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c\": rpc error: code = NotFound desc = could not find container \"f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c\": container with ID starting with 
f2b9596d64c2da17229f7286b6c0ddb372b6e7bb5a21d6fad998569c80a6ca8c not found: ID does not exist" Apr 16 16:33:46.753443 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.753414 2577 scope.go:117] "RemoveContainer" containerID="4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914" Apr 16 16:33:46.753719 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:33:46.753696 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914\": container with ID starting with 4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914 not found: ID does not exist" containerID="4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914" Apr 16 16:33:46.753769 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:46.753728 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914"} err="failed to get container status \"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914\": rpc error: code = NotFound desc = could not find container \"4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914\": container with ID starting with 4aadb280e29ae4f1e50acda8fd192ccd6ef9af156c7c967ff1b3b06170282914 not found: ID does not exist" Apr 16 16:33:48.550249 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:48.550219 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" path="/var/lib/kubelet/pods/e52f1951-9d17-45a7-8075-e427281ab2c3/volumes" Apr 16 16:33:51.944272 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:33:51.944223 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 16:34:00.960837 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.960742 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:34:00.961330 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.961309 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="storage-initializer" Apr 16 16:34:00.961378 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.961337 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="storage-initializer" Apr 16 16:34:00.961378 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.961359 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" Apr 16 16:34:00.961378 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.961367 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" Apr 16 16:34:00.961508 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.961484 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e52f1951-9d17-45a7-8075-e427281ab2c3" containerName="main" Apr 16 16:34:00.967936 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.967916 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:00.971933 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.971908 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-7c864\""
Apr 16 16:34:00.972972 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.972952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 16:34:00.986078 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:00.986052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 16:34:01.021152 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021119 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.021303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.021303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.021303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.021303 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzqz\" (UniqueName: \"kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.021489 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.021330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122038 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122221 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzqz\" (UniqueName: \"kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122590 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122590 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.122780 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.122640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.124622 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.124590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.124782 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.124760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.130066 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.130046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzqz\" (UniqueName: \"kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.278315 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.278217 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:34:01.406289 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.406265 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 16:34:01.409466 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:34:01.409407 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd749debd_42b8_4fe4_9449_33ea65c92f98.slice/crio-b61724c68dc35bf564192e5680beb96dfa8d27e30b11206f3291843bbda0cdd9 WatchSource:0}: Error finding container b61724c68dc35bf564192e5680beb96dfa8d27e30b11206f3291843bbda0cdd9: Status 404 returned error can't find the container with id b61724c68dc35bf564192e5680beb96dfa8d27e30b11206f3291843bbda0cdd9
Apr 16 16:34:01.411895 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.411876 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:34:01.707273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.707241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerStarted","Data":"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c"}
Apr 16 16:34:01.707273 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.707280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerStarted","Data":"b61724c68dc35bf564192e5680beb96dfa8d27e30b11206f3291843bbda0cdd9"}
Apr 16 16:34:01.944116 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:01.944066 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:34:06.725362 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:06.725327 2577 generic.go:358] "Generic (PLEG): container finished" podID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerID="67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c" exitCode=0
Apr 16 16:34:06.725774 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:06.725378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerDied","Data":"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c"}
Apr 16 16:34:07.730484 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:07.730436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerStarted","Data":"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795"}
Apr 16 16:34:07.752085 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:07.752032 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=7.752017195 podStartE2EDuration="7.752017195s" podCreationTimestamp="2026-04-16 16:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:34:07.750463481 +0000 UTC m=+2003.871121418" watchObservedRunningTime="2026-04-16 16:34:07.752017195 +0000 UTC m=+2003.872675133"
Apr 16 16:34:11.943518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:11.943466 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:34:12.218704 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.218653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs_1f2c1cd9-97f7-4b7c-bf38-787c24e91147/main/0.log"
Apr 16 16:34:12.219128 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.219111 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:34:12.325265 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325228 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52vhs\" (UniqueName: \"kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325480 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325292 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325480 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325314 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325480 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325377 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325480 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325398 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325480 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325427 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs\") pod \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\" (UID: \"1f2c1cd9-97f7-4b7c-bf38-787c24e91147\") "
Apr 16 16:34:12.325752 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.325647 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache" (OuterVolumeSpecName: "model-cache") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:34:12.326132 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.326004 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home" (OuterVolumeSpecName: "home") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:34:12.327789 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.327753 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm" (OuterVolumeSpecName: "dshm") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:34:12.328256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.328222 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:34:12.328332 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.328289 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs" (OuterVolumeSpecName: "kube-api-access-52vhs") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "kube-api-access-52vhs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:34:12.398390 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.398333 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1f2c1cd9-97f7-4b7c-bf38-787c24e91147" (UID: "1f2c1cd9-97f7-4b7c-bf38-787c24e91147"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:34:12.426917 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426883 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.426917 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426917 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.427135 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426932 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.427135 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426945 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.427135 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426959 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.427135 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.426973 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52vhs\" (UniqueName: \"kubernetes.io/projected/1f2c1cd9-97f7-4b7c-bf38-787c24e91147-kube-api-access-52vhs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:34:12.751217 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs_1f2c1cd9-97f7-4b7c-bf38-787c24e91147/main/0.log"
Apr 16 16:34:12.751576 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751545 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerID="0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311" exitCode=137
Apr 16 16:34:12.751710 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerDied","Data":"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"}
Apr 16 16:34:12.751710 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751636 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"
Apr 16 16:34:12.751710 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751649 2577 scope.go:117] "RemoveContainer" containerID="0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"
Apr 16 16:34:12.751845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.751637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs" event={"ID":"1f2c1cd9-97f7-4b7c-bf38-787c24e91147","Type":"ContainerDied","Data":"d6687806a05c029468e9857b4dc8ad4ce6136f085013c7d1a5d9a8d10b6cb382"}
Apr 16 16:34:12.773820 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.773741 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"]
Apr 16 16:34:12.775547 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.775171 2577 scope.go:117] "RemoveContainer" containerID="85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"
Apr 16 16:34:12.776286 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.776261 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcbq4gs"]
Apr 16 16:34:12.850256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.850234 2577 scope.go:117] "RemoveContainer" containerID="0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"
Apr 16 16:34:12.850665 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:34:12.850639 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311\": container with ID starting with 0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311 not found: ID does not exist" containerID="0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"
Apr 16 16:34:12.850771 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.850679 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311"} err="failed to get container status \"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311\": rpc error: code = NotFound desc = could not find container \"0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311\": container with ID starting with 0c2c8658c5c58d380a1ff408020cedda63a418f975ffcedbfb25f53961a20311 not found: ID does not exist"
Apr 16 16:34:12.850771 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.850700 2577 scope.go:117] "RemoveContainer" containerID="85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"
Apr 16 16:34:12.851001 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:34:12.850986 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb\": container with ID starting with 85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb not found: ID does not exist" containerID="85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"
Apr 16 16:34:12.851059 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:12.851009 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb"} err="failed to get container status \"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb\": rpc error: code = NotFound desc = could not find container \"85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb\": container with ID starting with 85e7ee9d70eb52882a632d46af72d78e019e65f0c7b2c78cdb5420445707a5fb not found: ID does not exist"
Apr 16 16:34:14.550569 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:14.550536 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" path="/var/lib/kubelet/pods/1f2c1cd9-97f7-4b7c-bf38-787c24e91147/volumes"
Apr 16 16:34:21.943874 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:21.943829 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:34:31.943657 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:31.943614 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:34:41.943841 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:41.943788 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:34:51.943540 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:34:51.943503 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 16:35:01.952915 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:01.952839 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:35:01.961207 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:01.961179 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:35:19.190625 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:19.190591 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"]
Apr 16 16:35:19.191094 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:19.190890 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" containerID="cri-o://e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951" gracePeriod=30
Apr 16 16:35:44.605027 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:44.605000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:35:44.608242 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:44.608218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:35:49.427544 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.427520 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:35:49.581344 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581251 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzjf\" (UniqueName: \"kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581344 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581325 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581350 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581381 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581408 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581609 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581475 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache\") pod \"5920342b-5ad0-4a17-9a56-775cadc3c716\" (UID: \"5920342b-5ad0-4a17-9a56-775cadc3c716\") "
Apr 16 16:35:49.581851 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581819 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache" (OuterVolumeSpecName: "model-cache") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:35:49.581968 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.581849 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home" (OuterVolumeSpecName: "home") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:35:49.583661 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.583637 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf" (OuterVolumeSpecName: "kube-api-access-ltzjf") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "kube-api-access-ltzjf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:35:49.584261 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.584237 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm" (OuterVolumeSpecName: "dshm") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:35:49.584261 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.584243 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:35:49.645138 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.645099 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5920342b-5ad0-4a17-9a56-775cadc3c716" (UID: "5920342b-5ad0-4a17-9a56-775cadc3c716"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:35:49.683023 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.682987 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:49.683023 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.683025 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltzjf\" (UniqueName: \"kubernetes.io/projected/5920342b-5ad0-4a17-9a56-775cadc3c716-kube-api-access-ltzjf\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:49.683256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.683038 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:49.683256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.683051 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:49.683256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.683065 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5920342b-5ad0-4a17-9a56-775cadc3c716-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:49.683256 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:49.683077 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5920342b-5ad0-4a17-9a56-775cadc3c716-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:35:50.096593 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.096551 2577 generic.go:358] "Generic (PLEG): container finished" podID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerID="e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951" exitCode=137
Apr 16 16:35:50.096888 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.096634 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"
Apr 16 16:35:50.096888 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.096632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerDied","Data":"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"}
Apr 16 16:35:50.096888 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.096746 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv" event={"ID":"5920342b-5ad0-4a17-9a56-775cadc3c716","Type":"ContainerDied","Data":"78274221da8994ea4048f9ceee259c14effbe5f90feb23c33b332ac4264ce0e4"}
Apr 16 16:35:50.096888 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.096762 2577 scope.go:117] "RemoveContainer" containerID="e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"
Apr 16 16:35:50.118141 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.118089 2577 scope.go:117] "RemoveContainer" containerID="7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f"
Apr 16 16:35:50.119567 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.119547 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"]
Apr 16 16:35:50.125250 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.125226 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-g8zwv"]
Apr 16 16:35:50.179514 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.179490 2577 scope.go:117] "RemoveContainer" containerID="e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"
Apr 16 16:35:50.179812 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:35:50.179794 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951\": container with ID starting with e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951 not found: ID does not exist" containerID="e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"
Apr 16 16:35:50.179865 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.179820 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951"} err="failed to get container status \"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951\": rpc error: code = NotFound desc = could not find container \"e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951\": container with ID starting with e4d38ad0647da0caf2e303adbecf027aef033b8094a483096fbb147321feb951 not found: ID does not exist"
Apr 16 16:35:50.179865 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.179844 2577 scope.go:117] "RemoveContainer" containerID="7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f"
Apr 16 16:35:50.180084 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:35:50.180068 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f\": container with ID starting with 7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f not found: ID does not exist"
containerID="7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f" Apr 16 16:35:50.180125 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.180087 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f"} err="failed to get container status \"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f\": rpc error: code = NotFound desc = could not find container \"7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f\": container with ID starting with 7fb75d994d8c0001c01b91e1f1837e742898f1085c9823ae88b1230eae06683f not found: ID does not exist" Apr 16 16:35:50.549133 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:35:50.549099 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" path="/var/lib/kubelet/pods/5920342b-5ad0-4a17-9a56-775cadc3c716/volumes" Apr 16 16:36:02.602924 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:02.602886 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:36:02.603349 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:02.603256 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="main" containerID="cri-o://ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795" gracePeriod=30 Apr 16 16:36:03.358964 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.358939 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:36:03.492696 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492604 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.492696 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492645 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.492696 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492670 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.492952 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492707 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.492952 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492731 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.492952 ip-10-0-140-78 kubenswrapper[2577]: I0416 
16:36:03.492758 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdzqz\" (UniqueName: \"kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz\") pod \"d749debd-42b8-4fe4-9449-33ea65c92f98\" (UID: \"d749debd-42b8-4fe4-9449-33ea65c92f98\") " Apr 16 16:36:03.493161 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.492970 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache" (OuterVolumeSpecName: "model-cache") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:03.493161 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.493082 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home" (OuterVolumeSpecName: "home") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:03.495082 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.495052 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:36:03.495195 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.495140 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz" (OuterVolumeSpecName: "kube-api-access-tdzqz") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "kube-api-access-tdzqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:36:03.495267 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.495257 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm" (OuterVolumeSpecName: "dshm") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:03.556190 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.556152 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d749debd-42b8-4fe4-9449-33ea65c92f98" (UID: "d749debd-42b8-4fe4-9449-33ea65c92f98"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:03.594356 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594326 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-model-cache\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:03.594356 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594357 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d749debd-42b8-4fe4-9449-33ea65c92f98-tls-certs\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:03.594517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594367 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdzqz\" (UniqueName: \"kubernetes.io/projected/d749debd-42b8-4fe4-9449-33ea65c92f98-kube-api-access-tdzqz\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:03.594517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594376 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-home\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:03.594517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594385 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-kserve-provision-location\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:03.594517 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:03.594394 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d749debd-42b8-4fe4-9449-33ea65c92f98-dshm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\"" Apr 16 16:36:04.144095 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.144055 2577 generic.go:358] 
"Generic (PLEG): container finished" podID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerID="ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795" exitCode=0 Apr 16 16:36:04.144518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.144110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerDied","Data":"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795"} Apr 16 16:36:04.144518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.144128 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:36:04.144518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.144144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d749debd-42b8-4fe4-9449-33ea65c92f98","Type":"ContainerDied","Data":"b61724c68dc35bf564192e5680beb96dfa8d27e30b11206f3291843bbda0cdd9"} Apr 16 16:36:04.144518 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.144163 2577 scope.go:117] "RemoveContainer" containerID="ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795" Apr 16 16:36:04.163102 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.163082 2577 scope.go:117] "RemoveContainer" containerID="67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c" Apr 16 16:36:04.170825 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.170805 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:36:04.173728 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.173706 2577 scope.go:117] "RemoveContainer" containerID="ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795" Apr 16 16:36:04.173902 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:36:04.173880 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:36:04.174023 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:36:04.174005 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795\": container with ID starting with ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795 not found: ID does not exist" containerID="ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795" Apr 16 16:36:04.174068 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.174031 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795"} err="failed to get container status \"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795\": rpc error: code = NotFound desc = could not find container \"ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795\": container with ID starting with ea830b4c7bb7e450fa97b457f4fbc4e2e6435e16869ac5a9f2001d298089b795 not found: ID does not exist" Apr 16 16:36:04.174068 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.174048 2577 scope.go:117] "RemoveContainer" containerID="67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c" Apr 16 16:36:04.174276 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:36:04.174260 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c\": container with ID starting with 67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c not found: ID does not exist" containerID="67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c" Apr 16 16:36:04.174319 ip-10-0-140-78 
kubenswrapper[2577]: I0416 16:36:04.174280 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c"} err="failed to get container status \"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c\": rpc error: code = NotFound desc = could not find container \"67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c\": container with ID starting with 67124ef4bd89411836b1bccb412f30a913f5381315b481342703df87170e719c not found: ID does not exist" Apr 16 16:36:04.550082 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:04.550007 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" path="/var/lib/kubelet/pods/d749debd-42b8-4fe4-9449-33ea65c92f98/volumes" Apr 16 16:36:09.775563 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775532 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nw7cl/must-gather-9xp46"] Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775823 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="main" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775834 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="main" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775843 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775848 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775859 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775865 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775875 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775880 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775885 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775890 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="storage-initializer" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775897 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" Apr 16 16:36:09.775909 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775902 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" Apr 16 16:36:09.776257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775948 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5920342b-5ad0-4a17-9a56-775cadc3c716" containerName="main" Apr 16 16:36:09.776257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775958 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="1f2c1cd9-97f7-4b7c-bf38-787c24e91147" containerName="main" Apr 16 16:36:09.776257 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.775964 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d749debd-42b8-4fe4-9449-33ea65c92f98" containerName="main" Apr 16 16:36:09.781197 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.781180 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:09.784001 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.783970 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nw7cl\"/\"kube-root-ca.crt\"" Apr 16 16:36:09.785292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.785239 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nw7cl\"/\"default-dockercfg-4c4l6\"" Apr 16 16:36:09.785292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.785239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nw7cl\"/\"openshift-service-ca.crt\"" Apr 16 16:36:09.787735 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.787715 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nw7cl/must-gather-9xp46"] Apr 16 16:36:09.945726 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.945687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdgm\" (UniqueName: \"kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:09.945902 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:09.945760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.046412 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.046332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdgm\" (UniqueName: \"kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.046412 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.046394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.046742 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.046721 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.055087 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.055055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdgm\" (UniqueName: \"kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm\") pod \"must-gather-9xp46\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") " pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.090682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.090647 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nw7cl/must-gather-9xp46" Apr 16 16:36:10.220159 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:10.220112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nw7cl/must-gather-9xp46"] Apr 16 16:36:10.222409 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:36:10.222385 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod076d879d_f202_44b2_a318_1d7205743f01.slice/crio-faa6da350a173bbdbfab3c2b2391178b76b104f46037aaedaad09860a8c68f46 WatchSource:0}: Error finding container faa6da350a173bbdbfab3c2b2391178b76b104f46037aaedaad09860a8c68f46: Status 404 returned error can't find the container with id faa6da350a173bbdbfab3c2b2391178b76b104f46037aaedaad09860a8c68f46 Apr 16 16:36:11.169040 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:11.168997 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nw7cl/must-gather-9xp46" event={"ID":"076d879d-f202-44b2-a318-1d7205743f01","Type":"ContainerStarted","Data":"faa6da350a173bbdbfab3c2b2391178b76b104f46037aaedaad09860a8c68f46"} Apr 16 16:36:15.183519 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:15.183481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nw7cl/must-gather-9xp46" event={"ID":"076d879d-f202-44b2-a318-1d7205743f01","Type":"ContainerStarted","Data":"1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777"} Apr 16 16:36:15.183986 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:15.183527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nw7cl/must-gather-9xp46" event={"ID":"076d879d-f202-44b2-a318-1d7205743f01","Type":"ContainerStarted","Data":"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"} Apr 16 16:36:15.202045 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:15.201984 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nw7cl/must-gather-9xp46" podStartSLOduration=1.840939211 podStartE2EDuration="6.201966872s" podCreationTimestamp="2026-04-16 16:36:09 +0000 UTC" firstStartedPulling="2026-04-16 16:36:10.224153395 +0000 UTC m=+2126.344811314" lastFinishedPulling="2026-04-16 16:36:14.585181055 +0000 UTC m=+2130.705838975" observedRunningTime="2026-04-16 16:36:15.200984046 +0000 UTC m=+2131.321641985" watchObservedRunningTime="2026-04-16 16:36:15.201966872 +0000 UTC m=+2131.322624812" Apr 16 16:36:36.942382 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:36.942304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76688fbdc6-w7hxk_16874dd3-186f-4feb-bc62-e0eae6ce6072/router/0.log" Apr 16 16:36:37.760879 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:37.760834 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76688fbdc6-w7hxk_16874dd3-186f-4feb-bc62-e0eae6ce6072/router/0.log" Apr 16 16:36:38.562864 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:38.562839 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-92v9v_c020fb01-b472-49c6-9d21-a85264d58aec/kuadrant-console-plugin/0.log" Apr 16 16:36:38.607326 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:38.607297 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-j5p5j_e542dd41-bcf6-4468-9e60-d9cbabe88534/limitador/0.log" Apr 16 16:36:40.272932 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:40.272901 2577 generic.go:358] "Generic (PLEG): container finished" podID="076d879d-f202-44b2-a318-1d7205743f01" containerID="63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1" exitCode=0 Apr 16 16:36:40.273338 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:40.272953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-nw7cl/must-gather-9xp46" event={"ID":"076d879d-f202-44b2-a318-1d7205743f01","Type":"ContainerDied","Data":"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"}
Apr 16 16:36:40.273338 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:40.273247 2577 scope.go:117] "RemoveContainer" containerID="63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"
Apr 16 16:36:40.307630 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:40.307577 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nw7cl_must-gather-9xp46_076d879d-f202-44b2-a318-1d7205743f01/gather/0.log"
Apr 16 16:36:44.396626 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:44.396592 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hwwrn_0af32a28-5811-4383-8faf-b6f21a0e2e4e/global-pull-secret-syncer/0.log"
Apr 16 16:36:44.519302 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:44.519271 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v84j7_6b98433d-7f8e-4ded-9eb4-1696faebcffe/konnectivity-agent/0.log"
Apr 16 16:36:44.584498 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:44.584468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-78.ec2.internal_2e885656e91016de60540dee8de7f05c/haproxy/0.log"
Apr 16 16:36:46.333374 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.333340 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nw7cl/must-gather-9xp46"]
Apr 16 16:36:46.333834 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.333576 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-nw7cl/must-gather-9xp46" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="copy" containerID="cri-o://1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777" gracePeriod=2
Apr 16 16:36:46.338427 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.338401 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nw7cl/must-gather-9xp46"]
Apr 16 16:36:46.562778 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.562756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nw7cl_must-gather-9xp46_076d879d-f202-44b2-a318-1d7205743f01/copy/0.log"
Apr 16 16:36:46.563100 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.563081 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nw7cl/must-gather-9xp46"
Apr 16 16:36:46.670540 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.670514 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output\") pod \"076d879d-f202-44b2-a318-1d7205743f01\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") "
Apr 16 16:36:46.670672 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.670565 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdgm\" (UniqueName: \"kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm\") pod \"076d879d-f202-44b2-a318-1d7205743f01\" (UID: \"076d879d-f202-44b2-a318-1d7205743f01\") "
Apr 16 16:36:46.672902 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.672872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm" (OuterVolumeSpecName: "kube-api-access-fmdgm") pod "076d879d-f202-44b2-a318-1d7205743f01" (UID: "076d879d-f202-44b2-a318-1d7205743f01"). InnerVolumeSpecName "kube-api-access-fmdgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:36:46.675787 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.675765 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "076d879d-f202-44b2-a318-1d7205743f01" (UID: "076d879d-f202-44b2-a318-1d7205743f01"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:36:46.771361 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.771332 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/076d879d-f202-44b2-a318-1d7205743f01-must-gather-output\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:36:46.771361 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:46.771359 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmdgm\" (UniqueName: \"kubernetes.io/projected/076d879d-f202-44b2-a318-1d7205743f01-kube-api-access-fmdgm\") on node \"ip-10-0-140-78.ec2.internal\" DevicePath \"\""
Apr 16 16:36:47.295233 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.295197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nw7cl_must-gather-9xp46_076d879d-f202-44b2-a318-1d7205743f01/copy/0.log"
Apr 16 16:36:47.295562 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.295540 2577 generic.go:358] "Generic (PLEG): container finished" podID="076d879d-f202-44b2-a318-1d7205743f01" containerID="1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777" exitCode=143
Apr 16 16:36:47.295621 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.295606 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nw7cl/must-gather-9xp46"
Apr 16 16:36:47.295662 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.295631 2577 scope.go:117] "RemoveContainer" containerID="1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777"
Apr 16 16:36:47.304128 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.304101 2577 scope.go:117] "RemoveContainer" containerID="63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"
Apr 16 16:36:47.315976 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.315959 2577 scope.go:117] "RemoveContainer" containerID="1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777"
Apr 16 16:36:47.316216 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:36:47.316198 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777\": container with ID starting with 1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777 not found: ID does not exist" containerID="1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777"
Apr 16 16:36:47.316279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.316225 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777"} err="failed to get container status \"1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777\": rpc error: code = NotFound desc = could not find container \"1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777\": container with ID starting with 1694eda0de39d1f78947cf918bcb87882c1e3f62e84ce2cf088f6ed28a680777 not found: ID does not exist"
Apr 16 16:36:47.316279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.316243 2577 scope.go:117] "RemoveContainer" containerID="63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"
Apr 16 16:36:47.316497 ip-10-0-140-78 kubenswrapper[2577]: E0416 16:36:47.316479 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1\": container with ID starting with 63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1 not found: ID does not exist" containerID="63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"
Apr 16 16:36:47.316548 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:47.316504 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1"} err="failed to get container status \"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1\": rpc error: code = NotFound desc = could not find container \"63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1\": container with ID starting with 63299e687feebcd1d34a3e2a121906949a3990487f087ebd8b283dca896601b1 not found: ID does not exist"
Apr 16 16:36:48.513330 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:48.513296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-92v9v_c020fb01-b472-49c6-9d21-a85264d58aec/kuadrant-console-plugin/0.log"
Apr 16 16:36:48.549663 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:48.549637 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076d879d-f202-44b2-a318-1d7205743f01" path="/var/lib/kubelet/pods/076d879d-f202-44b2-a318-1d7205743f01/volumes"
Apr 16 16:36:48.593238 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:48.593203 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-j5p5j_e542dd41-bcf6-4468-9e60-d9cbabe88534/limitador/0.log"
Apr 16 16:36:49.774355 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:49.774328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-6dhvc_5f55b121-017a-43b5-97b3-540ef3a398d6/cluster-monitoring-operator/0.log"
Apr 16 16:36:50.094499 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:50.094402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q585g_1f6b813b-ad30-4c0d-a05b-17adb2975c08/node-exporter/0.log"
Apr 16 16:36:50.118860 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:50.118836 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q585g_1f6b813b-ad30-4c0d-a05b-17adb2975c08/kube-rbac-proxy/0.log"
Apr 16 16:36:50.142543 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:50.142520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q585g_1f6b813b-ad30-4c0d-a05b-17adb2975c08/init-textfile/0.log"
Apr 16 16:36:51.905809 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:51.905778 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-6xbxt_643d0a94-27da-40d4-a8e6-22ee5f06edb9/networking-console-plugin/0.log"
Apr 16 16:36:52.407965 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.407941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6hkg6_c4e2a6f2-a469-4d6a-92f6-19e95df574de/console-operator/0.log"
Apr 16 16:36:52.899113 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899076 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"]
Apr 16 16:36:52.899528 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899507 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="gather"
Apr 16 16:36:52.899647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899531 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="gather"
Apr 16 16:36:52.899647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899549 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="copy"
Apr 16 16:36:52.899647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899557 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="copy"
Apr 16 16:36:52.899647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899632 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="gather"
Apr 16 16:36:52.899647 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.899644 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="076d879d-f202-44b2-a318-1d7205743f01" containerName="copy"
Apr 16 16:36:52.905229 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.905210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:52.908110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.908078 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xr7j8\"/\"kube-root-ca.crt\""
Apr 16 16:36:52.908486 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.908145 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xr7j8\"/\"openshift-service-ca.crt\""
Apr 16 16:36:52.908486 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.908083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xr7j8\"/\"default-dockercfg-dtjgr\""
Apr 16 16:36:52.909226 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:52.909201 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"]
Apr 16 16:36:53.023117 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.023081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-proc\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.023292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.023127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-sys\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.023292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.023163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltsq\" (UniqueName: \"kubernetes.io/projected/f73a6c75-f8da-4ca1-9001-9da8908c8583-kube-api-access-8ltsq\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.023292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.023189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-podres\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.023292 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.023219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-lib-modules\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124020 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.123991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-lib-modules\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-proc\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-sys\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltsq\" (UniqueName: \"kubernetes.io/projected/f73a6c75-f8da-4ca1-9001-9da8908c8583-kube-api-access-8ltsq\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124149 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-podres\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-lib-modules\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-sys\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-proc\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.124279 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.124228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73a6c75-f8da-4ca1-9001-9da8908c8583-podres\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.132599 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.132569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltsq\" (UniqueName: \"kubernetes.io/projected/f73a6c75-f8da-4ca1-9001-9da8908c8583-kube-api-access-8ltsq\") pod \"perf-node-gather-daemonset-4t29r\" (UID: \"f73a6c75-f8da-4ca1-9001-9da8908c8583\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.215847 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.215764 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:53.334947 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.334922 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"]
Apr 16 16:36:53.337621 ip-10-0-140-78 kubenswrapper[2577]: W0416 16:36:53.337582 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf73a6c75_f8da_4ca1_9001_9da8908c8583.slice/crio-2943d7f8db313cae44fe4f5852b036a7f64ef1369561a6a874603251b34f2d14 WatchSource:0}: Error finding container 2943d7f8db313cae44fe4f5852b036a7f64ef1369561a6a874603251b34f2d14: Status 404 returned error can't find the container with id 2943d7f8db313cae44fe4f5852b036a7f64ef1369561a6a874603251b34f2d14
Apr 16 16:36:53.358952 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:53.358933 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-bxn9d_ad8ff5ed-0304-45fd-87fb-809604b2cc69/volume-data-source-validator/0.log"
Apr 16 16:36:54.158110 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.158065 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w97np_600ee1fa-9fa9-462c-88e4-75f0cb67d3e7/dns/0.log"
Apr 16 16:36:54.175615 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.175588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w97np_600ee1fa-9fa9-462c-88e4-75f0cb67d3e7/kube-rbac-proxy/0.log"
Apr 16 16:36:54.238768 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.238741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rbq9r_3ef774fd-1deb-4665-96a2-b7fe3ffcc315/dns-node-resolver/0.log"
Apr 16 16:36:54.317410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.317378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r" event={"ID":"f73a6c75-f8da-4ca1-9001-9da8908c8583","Type":"ContainerStarted","Data":"e9b7de538d9f48a058935a425970fdedcf7b9160de8524869818efc00b6bc649"}
Apr 16 16:36:54.317410 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.317411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r" event={"ID":"f73a6c75-f8da-4ca1-9001-9da8908c8583","Type":"ContainerStarted","Data":"2943d7f8db313cae44fe4f5852b036a7f64ef1369561a6a874603251b34f2d14"}
Apr 16 16:36:54.317643 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.317503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:36:54.334375 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.334333 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r" podStartSLOduration=2.334320071 podStartE2EDuration="2.334320071s" podCreationTimestamp="2026-04-16 16:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:36:54.333234261 +0000 UTC m=+2170.453892204" watchObservedRunningTime="2026-04-16 16:36:54.334320071 +0000 UTC m=+2170.454978035"
Apr 16 16:36:54.684898 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:54.684874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lz8kp_72eac7d7-86dd-4e53-a6fb-3b9ea7c2b96f/node-ca/0.log"
Apr 16 16:36:55.593861 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:55.593837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76688fbdc6-w7hxk_16874dd3-186f-4feb-bc62-e0eae6ce6072/router/0.log"
Apr 16 16:36:56.048550 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:56.048515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gt8zg_518655bd-6e85-4b6c-9e7d-11b4b7bc3edd/serve-healthcheck-canary/0.log"
Apr 16 16:36:56.527511 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:56.527480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-t2wfh_c8c970dd-50f2-41a4-824e-2a03a125acfd/insights-operator/0.log"
Apr 16 16:36:56.547740 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:56.547714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjzps_42c41435-4859-4c94-aa0a-bf577233ed83/kube-rbac-proxy/0.log"
Apr 16 16:36:56.569845 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:56.569818 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjzps_42c41435-4859-4c94-aa0a-bf577233ed83/exporter/0.log"
Apr 16 16:36:56.593228 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:56.593205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjzps_42c41435-4859-4c94-aa0a-bf577233ed83/extractor/0.log"
Apr 16 16:36:59.190258 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:36:59.190227 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5879d548d6-777dg_2b007a35-c94c-4587-8a6c-c23126562d5d/manager/0.log"
Apr 16 16:37:00.331133 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:00.331108 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-4t29r"
Apr 16 16:37:05.030202 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:05.030176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-5szhf_30d4ab93-1d05-421c-a438-24378cc5b1b4/kube-storage-version-migrator-operator/0.log"
Apr 16 16:37:06.332189 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.332159 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:37:06.357121 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.357092 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/egress-router-binary-copy/0.log"
Apr 16 16:37:06.381826 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.381805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/cni-plugins/0.log"
Apr 16 16:37:06.407247 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.407221 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/bond-cni-plugin/0.log"
Apr 16 16:37:06.432529 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.432492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/routeoverride-cni/0.log"
Apr 16 16:37:06.456557 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.456532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/whereabouts-cni-bincopy/0.log"
Apr 16 16:37:06.484583 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.484561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmfsc_803316ee-89ca-4413-8b50-f8e2771aae54/whereabouts-cni/0.log"
Apr 16 16:37:06.877682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.877649 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z72rs_4f04a36e-4a07-4d11-aa7a-7a9cfb6156c2/kube-multus/0.log"
Apr 16 16:37:06.901395 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.901370 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nwbqd_dc41db5b-cc00-4f67-be36-85fcec39776e/network-metrics-daemon/0.log"
Apr 16 16:37:06.923409 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:06.923386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nwbqd_dc41db5b-cc00-4f67-be36-85fcec39776e/kube-rbac-proxy/0.log"
Apr 16 16:37:08.480578 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.480548 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-controller/0.log"
Apr 16 16:37:08.497260 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.497237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/0.log"
Apr 16 16:37:08.507215 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.507183 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovn-acl-logging/1.log"
Apr 16 16:37:08.526829 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.526809 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/kube-rbac-proxy-node/0.log"
Apr 16 16:37:08.550682 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.550662 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 16:37:08.571751 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.571726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/northd/0.log"
Apr 16 16:37:08.592136 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.592113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/nbdb/0.log"
Apr 16 16:37:08.612632 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.612611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/sbdb/0.log"
Apr 16 16:37:08.709814 ip-10-0-140-78 kubenswrapper[2577]: I0416 16:37:08.709783 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-grb4s_b0890f1f-a03c-4336-8b33-5daaa65706b0/ovnkube-controller/0.log"