Apr 22 17:34:26.400014 ip-10-0-135-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:26.848508 ip-10-0-135-36 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:26.848508 ip-10-0-135-36 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:26.848508 ip-10-0-135-36 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:26.848508 ip-10-0-135-36 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:26.848508 ip-10-0-135-36 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
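The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file passed via --config. The following is a hypothetical sketch of the config-file equivalents (field names are from the kubelet.config.k8s.io/v1beta1 API; the values shown are illustrative, not taken from this node):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (key=value pairs become a map)
systemReserved:
  cpu: 500m
  memory: 1Gi
```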
Apr 22 17:34:26.849721 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.849277 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:26.855406 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855388 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:26.855406 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855406 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:26.855406 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855410 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855415 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855418 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855421 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855426 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855429 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855432 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855435 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855438 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855441 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855444 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855447 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855450 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855453 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855455 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855458 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855461 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855466 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855474 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855477 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:26.855506 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855480 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855483 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855485 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855488 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855490 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855493 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855495 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855498 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855500 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855503 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855506 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855508 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855511 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855514 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855517 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855520 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855522 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855525 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855527 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855530 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:26.856044 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855532 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855535 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855537 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855546 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855548 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855552 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855554 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855557 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855559 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855562 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855564 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855567 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855570 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855572 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855575 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855578 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855581 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855583 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855586 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855588 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:26.856552 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855591 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855593 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855596 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855598 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855601 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855603 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855606 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855608 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855611 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855614 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855618 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855621 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855624 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855627 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855630 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855633 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855635 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855638 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855640 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:26.857061 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855643 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855645 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855648 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855651 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.855653 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856087 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856093 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856096 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856099 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856102 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856105 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856108 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856110 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856113 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856115 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856118 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856120 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856123 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856126 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856129 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:26.857605 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856133 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856136 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856139 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856141 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856144 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856146 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856149 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856152 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856154 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856157 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856159 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856162 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856164 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856166 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856169 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856171 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856174 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856177 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856180 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:26.858107 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856183 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856186 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856188 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856191 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856193 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856196 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856199 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856201 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856204 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856206 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856209 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856211 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856214 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856216 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856218 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856221 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856223 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856226 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856228 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856231 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:26.858581 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856233 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856236 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856238 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856241 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856243 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856245 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856247 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856250 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856252 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856255 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856257 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856260 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856264 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856267 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856270 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856272 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856274 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856278 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856282 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:26.859088 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856285 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856288 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856291 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856293 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856296 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856299 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856302 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856304 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856307 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856310 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856313 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856315 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.856318 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856398 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856406 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856412 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856417 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856422 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856425 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856430 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856434 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:26.859762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856438 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856441 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856444 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856448 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856452 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856456 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856459 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856462 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856465 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856468 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856471 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856475 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856477 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856480 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856483 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856487 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856490 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856494 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856497 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856500 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856503 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856506 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856509 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856512 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856515 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:26.860469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856519 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856522 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856525 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856528 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856530 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856533 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856538 2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856541 2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856544 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856547 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856550 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856554 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856558 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856561 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856565 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856568 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856571 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856574 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856577 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856580 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856582 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856585 2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856589 2572 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856592 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856596 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:34:26.861098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856599 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856602 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856605 2572 flags.go:64] FLAG: --help="false" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856608 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-135-36.ec2.internal" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856611 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856614 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856617 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856621 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856624 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856627 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856630 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:34:26.856633 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856636 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856639 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856643 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856646 2572 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856649 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856652 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856655 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856658 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856661 2572 flags.go:64] FLAG: --lock-file="" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856664 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856667 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856670 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:34:26.861793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856675 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856678 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:34:26.862416 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856681 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856684 2572 flags.go:64] FLAG: --logging-format="text" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856687 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856690 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856693 2572 flags.go:64] FLAG: --manifest-url="" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856710 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856714 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856720 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856724 2572 flags.go:64] FLAG: --max-pods="110" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856727 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856730 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856733 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856737 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856740 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856743 2572 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856746 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856753 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856756 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856759 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856762 2572 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856765 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:34:26.862416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856771 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856774 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856777 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856780 2572 flags.go:64] FLAG: --port="10250" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856784 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856787 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0569363543fa1fade" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856793 2572 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856797 
2572 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856800 2572 flags.go:64] FLAG: --register-node="true" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856803 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856806 2572 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856809 2572 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856812 2572 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856815 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856818 2572 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856822 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856825 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856828 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856832 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856835 2572 flags.go:64] FLAG: --runonce="false" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856838 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856841 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856844 2572 
flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856848 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856851 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856854 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:34:26.863004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856857 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856861 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856863 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856866 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856869 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856876 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856879 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856882 2572 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856885 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856890 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856893 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 22 
17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856896 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856899 2572 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856903 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856906 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856909 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856912 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856915 2572 flags.go:64] FLAG: --v="2" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856920 2572 flags.go:64] FLAG: --version="false" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856924 2572 flags.go:64] FLAG: --vmodule="" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856929 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.856932 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857025 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857028 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:26.863643 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857033 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:26.864275 
ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857036 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857039 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857042 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857045 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857049 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857053 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857056 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857059 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857061 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857064 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857067 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857070 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857074 2572 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857076 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857079 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857082 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857084 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857088 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:34:26.864275 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857092 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857095 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857098 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857101 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857104 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857107 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857109 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 
22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857113 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857115 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857118 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857120 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857123 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857125 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857130 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857133 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857135 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857139 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857141 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857143 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857146 2572 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:26.864781 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857148 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857151 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857153 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857156 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857158 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857160 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857164 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857167 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857170 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857172 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857175 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857178 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: 
W0422 17:34:26.857181 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857183 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857186 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857188 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857192 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857195 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857197 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857199 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:26.865274 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857202 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857204 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857208 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857210 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857213 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857216 2572 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857219 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857221 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857224 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857226 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857229 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857231 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857234 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857236 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857239 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857241 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857243 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857246 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:26.865789 
ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857250 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857253 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:26.865789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857256 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857258 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857261 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857263 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.857266 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.858112 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.865158 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.865177 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865230 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865237 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865240 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865243 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865246 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865250 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865252 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865255 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:26.866285 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865258 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865261 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865264 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865266 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865269 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865272 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865276 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865280 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865284 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865286 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865289 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865292 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865295 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865298 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865301 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865304 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865307 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865309 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865312 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:26.866729 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865315 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865318 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865320 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865323 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865325 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865328 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865331 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865333 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865335 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865338 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865341 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865343 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865346 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865349 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865351 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865354 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865357 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865360 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865362 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865365 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865368 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:26.867199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865370 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865373 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865375 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865378 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865380 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865383 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865385 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865390 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865395 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865398 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865401 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865403 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865406 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865409 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865411 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865414 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865417 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865419 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865422 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:26.867720 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865424 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865427 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865429 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865432 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865434 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865437 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865440 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865442 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865445 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865448 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865451 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865454 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865456 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865459 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865462 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865464 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865467 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865469 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:26.868199 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865472 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.865477 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865609 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865615 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865618 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865621 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865624 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865627 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865629 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865632 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865634 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865637 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865639 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865642 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865644 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865647 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:26.868638 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865649 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865652 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865655 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865657 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865660 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865663 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865665 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865668 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865671 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865674 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865676 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865680 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865684 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865687 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865689 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865692 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865712 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865715 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865718 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865721 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:26.869067 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865724 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865726 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865729 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865731 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865734 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865736 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865739 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865742 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865744 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865747 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865750 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865752 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865755 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865757 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865760 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865762 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865765 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865767 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865770 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865773 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:26.869557 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865775 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865779 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865781 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865784 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865786 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865789 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865791 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865794 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865797 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865800 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865802 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865805 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865808 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865810 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865813 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865815 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865818 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865820 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865823 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:26.870258 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865827 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865830 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865833 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865836 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865839 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865841 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865844 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865846 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865849 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865851 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865854 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865857 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:26.865859 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.865864 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.865988 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:34:26.870754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.869576 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:34:26.871237 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.870731 2572 server.go:1019] "Starting client certificate rotation"
Apr 22 17:34:26.871237 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.870847 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:26.871237 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.870905 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:26.895354 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.895323 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:26.897815 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.897781 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:26.914036 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.914011 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:34:26.920336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.920314 2572 log.go:25] "Validated CRI v1 image API"
Apr 22 17:34:26.922774 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.922752 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:34:26.926497 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.926472 2572 fs.go:135] Filesystem UUIDs: map[055f6e27-65f0-446a-93e8-9a5386f59ecc:/dev/nvme0n1p3 6df41efd-4115-4c5e-a481-2de064f7d1b3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 17:34:26.926597 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.926494 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:34:26.928260 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.928235 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:26.932426 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.932303 2572 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:26.930373064 +0000 UTC m=+0.409629737 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3121037 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b590394e42f80496b4f5d46f3bb9d SystemUUID:ec2b5903-94e4-2f80-496b-4f5d46f3bb9d BootID:9473baae-ce70-48d0-96c5-b7a319031f0b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:64:cc:ad:25:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:64:cc:ad:25:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:31:00:15:6e:c4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:34:26.932426 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.932414 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:34:26.932599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.932538 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:34:26.935262 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935235 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:34:26.935429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935263 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:34:26.935515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935444 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:34:26.935515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935457 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:34:26.935515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935476 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:26.935515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.935496 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:26.936941 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.936927 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:26.937123 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.937112 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:34:26.940068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.940056 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:34:26.940133 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.940100 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:34:26.940133 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.940124 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:34:26.940226 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.940138 2572 kubelet.go:397] "Adding apiserver pod source"
Apr
22 17:34:26.940226 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.940152 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 17:34:26.941952 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.941939 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:34:26.942016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.941963 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:34:26.947483 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.947467 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:34:26.948799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.948783 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:34:26.950725 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950706 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:34:26.950725 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950726 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950732 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950738 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950744 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950761 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:34:26.950836 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:34:26.950767 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950773 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950780 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950786 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950806 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:34:26.950836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.950819 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:34:26.951730 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.951720 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:34:26.951821 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.951733 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:34:26.952130 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.952113 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ksbvh" Apr 22 17:34:26.952781 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.952762 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:26.952813 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.952767 2572 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Node: nodes \"ip-10-0-135-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:26.955361 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.955348 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:34:26.955423 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.955386 2572 server.go:1295] "Started kubelet" Apr 22 17:34:26.955517 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.955489 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:34:26.955569 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.955500 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:34:26.955569 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.955563 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:34:26.956110 ip-10-0-135-36 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:34:26.956888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.956871 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:34:26.958306 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.958293 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:34:26.961105 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.961085 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ksbvh" Apr 22 17:34:26.963022 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.962976 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:26.963100 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.963042 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:26.963554 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.963538 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:34:26.965049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.964386 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:34:26.965049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.964409 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:34:26.965049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.964547 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:34:26.965049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.964622 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:34:26.965049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.964629 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:34:26.965299 
ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.965122 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:26.965676 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.963074 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-36.ec2.internal.18a8be476ba178a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-36.ec2.internal,UID:ip-10-0-135-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-36.ec2.internal,},FirstTimestamp:2026-04-22 17:34:26.955360416 +0000 UTC m=+0.434617091,LastTimestamp:2026-04-22 17:34:26.955360416 +0000 UTC m=+0.434617091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-36.ec2.internal,}" Apr 22 17:34:26.966362 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.966312 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:34:26.966848 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.966825 2572 factory.go:55] Registering systemd factory Apr 22 17:34:26.966919 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.966887 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:34:26.967156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.967145 2572 factory.go:153] Registering CRI-O factory Apr 22 17:34:26.967208 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.967158 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 17:34:26.967255 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.967226 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:34:26.967320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.967278 2572 factory.go:103] Registering Raw factory Apr 22 17:34:26.967320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.967316 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 17:34:26.968643 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.968390 2572 manager.go:319] Starting recovery of all containers Apr 22 17:34:26.973287 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.973268 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:26.976281 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:26.976249 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-36.ec2.internal\" not found" node="ip-10-0-135-36.ec2.internal" Apr 22 17:34:26.981242 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.981224 2572 manager.go:324] Recovery 
completed Apr 22 17:34:26.985928 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.985914 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:26.988529 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.988511 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:26.988609 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.988539 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:26.988609 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.988550 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:26.989101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.989086 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:34:26.989101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.989097 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:34:26.989222 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.989117 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:34:26.991782 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.991761 2572 policy_none.go:49] "None policy: Start" Apr 22 17:34:26.991821 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.991786 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:34:26.991821 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:26.991797 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037282 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.037509 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037523 2572 server.go:85] "Starting device plugin registration server" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037777 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037790 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037897 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037972 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.037979 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.039113 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:34:27.044768 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.039148 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.092243 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.092194 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:34:27.093450 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.093425 2572 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 17:34:27.093450 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.093450 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:34:27.093627 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.093469 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 17:34:27.093627 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.093477 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:34:27.093627 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.093516 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:34:27.095937 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.095916 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:27.138848 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.138781 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.139974 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.139958 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.140061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.139989 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.140061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.139999 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.140061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.140025 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.148476 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.148457 2572 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.148580 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.148483 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-36.ec2.internal\": node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.167368 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.167331 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.194312 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.194280 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal"] Apr 22 17:34:27.194451 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.194371 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.195360 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.195338 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.195472 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.195376 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.195472 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.195391 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.197850 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.197837 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.198001 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.197985 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.198043 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198018 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.198621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198606 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.198687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198634 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.198687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198635 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.198687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198644 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.198687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198654 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.198687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.198664 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.201110 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.201097 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.201155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.201123 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.203542 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.203523 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.203637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.203555 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.203637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.203571 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.230471 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.230449 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-36.ec2.internal\" not found" node="ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.234932 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.234913 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-36.ec2.internal\" not found" node="ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.266130 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.266102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.266130 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:34:27.266131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.266283 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.266151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec7cb24e845e46d881c2a9f07f361da0-config\") pod \"kube-apiserver-proxy-ip-10-0-135-36.ec2.internal\" (UID: \"ec7cb24e845e46d881c2a9f07f361da0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.268204 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.268185 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.367268 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.367268 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.367452 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.367452 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec7cb24e845e46d881c2a9f07f361da0-config\") pod \"kube-apiserver-proxy-ip-10-0-135-36.ec2.internal\" (UID: \"ec7cb24e845e46d881c2a9f07f361da0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.367452 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec7cb24e845e46d881c2a9f07f361da0-config\") pod \"kube-apiserver-proxy-ip-10-0-135-36.ec2.internal\" (UID: \"ec7cb24e845e46d881c2a9f07f361da0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.367452 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.367381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3e046f13e45ad604d042daea0e97d3a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal\" (UID: \"b3e046f13e45ad604d042daea0e97d3a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.369258 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.369239 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.470073 ip-10-0-135-36 
kubenswrapper[2572]: E0422 17:34:27.470002 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.532232 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.532197 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.537677 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.537659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" Apr 22 17:34:27.570736 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.570687 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.671257 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.671227 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.771816 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.771729 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found" Apr 22 17:34:27.870318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.870288 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:34:27.870895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.870438 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:34:27.870895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.870445 2572 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:27.872415 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.872395 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found"
Apr 22 17:34:27.963115 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.963067 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:26 +0000 UTC" deadline="2027-12-09 04:01:51.58255669 +0000 UTC"
Apr 22 17:34:27.963115 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.963111 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14290h27m23.619449036s"
Apr 22 17:34:27.963115 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.963086 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:27.972865 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:27.972841 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found"
Apr 22 17:34:27.978670 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:27.978647 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:28.000634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.000604 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fg8d"
Apr 22 17:34:28.008538 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.008516 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fg8d"
Apr 22 17:34:28.044171 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:28.044133 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e046f13e45ad604d042daea0e97d3a.slice/crio-0999c341b771c85687bd0a1934154580e2703d53c3ba735cf3698a721ed26645 WatchSource:0}: Error finding container 0999c341b771c85687bd0a1934154580e2703d53c3ba735cf3698a721ed26645: Status 404 returned error can't find the container with id 0999c341b771c85687bd0a1934154580e2703d53c3ba735cf3698a721ed26645
Apr 22 17:34:28.044588 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:28.044571 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7cb24e845e46d881c2a9f07f361da0.slice/crio-696daae9a86895c5302b24efcef5687ddb6d700b86c1a43dd8b8826bb402ae38 WatchSource:0}: Error finding container 696daae9a86895c5302b24efcef5687ddb6d700b86c1a43dd8b8826bb402ae38: Status 404 returned error can't find the container with id 696daae9a86895c5302b24efcef5687ddb6d700b86c1a43dd8b8826bb402ae38
Apr 22 17:34:28.049770 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.049752 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:34:28.073923 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:28.073894 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found"
Apr 22 17:34:28.097076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.097025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" event={"ID":"b3e046f13e45ad604d042daea0e97d3a","Type":"ContainerStarted","Data":"0999c341b771c85687bd0a1934154580e2703d53c3ba735cf3698a721ed26645"}
Apr 22 17:34:28.097986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.097963 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" event={"ID":"ec7cb24e845e46d881c2a9f07f361da0","Type":"ContainerStarted","Data":"696daae9a86895c5302b24efcef5687ddb6d700b86c1a43dd8b8826bb402ae38"}
Apr 22 17:34:28.174533 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:28.174501 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-36.ec2.internal\" not found"
Apr 22 17:34:28.226783 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.226753 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:28.265495 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.265463 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal"
Apr 22 17:34:28.275677 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.275653 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:28.277572 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.277559 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal"
Apr 22 17:34:28.283287 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.283268 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:28.403550 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.403522 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:28.864779 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.864751 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:28.941251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.941223 2572 apiserver.go:52] "Watching apiserver"
Apr 22 17:34:28.948110 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.948087 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:34:28.949894 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.949865 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hrsxq","kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal","openshift-cluster-node-tuning-operator/tuned-xjjrz","openshift-dns/node-resolver-4nmbk","openshift-image-registry/node-ca-7q8mm","openshift-multus/network-metrics-daemon-djttm","kube-system/konnectivity-agent-8n7x2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal","openshift-multus/multus-additional-cni-plugins-5drmj","openshift-multus/multus-q2vht","openshift-network-diagnostics/network-check-target-g829p","openshift-network-operator/iptables-alerter-j4hck"]
Apr 22 17:34:28.952401 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.952380 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:28.954688 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.954667 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:34:28.954843 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.954825 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:34:28.954948 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.954930 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x6759\""
Apr 22 17:34:28.957248 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.957214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.957398 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.957379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:28.959425 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.959288 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnsbx\""
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960025 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7q8mm"
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960047 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960106 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960115 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6fs2d\""
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960216 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.960431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.960285 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.962180 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.962157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.962369 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.962352 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sfkpl\""
Apr 22 17:34:28.962369 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.962361 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.962598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.962579 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:34:28.962757 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.962740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:28.962828 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:28.962815 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:28.965030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.965004 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967080 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967097 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967152 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967088 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t8jx7\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967205 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967208 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:34:28.967277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967260 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:34:28.967951 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.967835 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:28.970213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.970176 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:28.970497 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.970459 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.970631 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.970609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.970746 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.970730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:34:28.970897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.970878 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-59wpn\""
Apr 22 17:34:28.972061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972036 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:34:28.972214 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972192 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:34:28.972289 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972270 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8blbk\""
Apr 22 17:34:28.972348 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972303 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:34:28.972861 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972506 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:34:28.972861 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972512 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.972861 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.972599 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q2vht"
Apr 22 17:34:28.974428 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.974412 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:34:28.974517 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.974469 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42f6v\""
Apr 22 17:34:28.975051 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.975034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:28.975131 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:28.975103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:28.977169 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-log-socket\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-bin\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pms\" (UniqueName: \"kubernetes.io/projected/a719b6ff-4e34-4393-bec2-9239979501ec-kube-api-access-f5pms\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysconfig\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.977408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977301 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j4hck"
Apr 22 17:34:28.977408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-sys\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.977408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-script-lib\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-modprobe-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.977408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-var-lib-kubelet\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.977637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-host\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.977637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-serviceca\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm"
Apr 22 17:34:28.977637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-systemd-units\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-ovn\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c07b1eb-ef24-4289-8216-10e5782c6173-konnectivity-ca\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-hosts-file\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnvx\" (UniqueName: \"kubernetes.io/projected/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-kube-api-access-vcnvx\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977787 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-host\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4n59\" (UniqueName: \"kubernetes.io/projected/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-kube-api-access-d4n59\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-kubelet\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-etc-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.977879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-netd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-config\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-conf\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-lib-modules\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-systemd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.977998 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-node-log\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-env-overrides\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gt6\" (UniqueName: \"kubernetes.io/projected/34c7625b-b71f-4d8d-a883-c465098dbba7-kube-api-access-52gt6\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:28.978116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-run\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-etc-tuned\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-tmp-dir\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-tmp\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-netns\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c07b1eb-ef24-4289-8216-10e5782c6173-agent-certs\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-kubernetes\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-var-lib-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a719b6ff-4e34-4393-bec2-9239979501ec-ovn-node-metrics-cert\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-systemd\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.978460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5zj\" (UniqueName: \"kubernetes.io/projected/1faba3f3-2377-439b-94d0-db64e401cf51-kube-api-access-4p5zj\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:28.979057 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.978478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-slash\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:28.979458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.979413 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:34:28.979458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.979417 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:28.979458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.979455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrv6x\""
Apr 22 17:34:28.979630 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:28.979538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:29.009954 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.009925 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:28 +0000 UTC" deadline="2028-01-28 22:22:06.468697668 +0000 UTC"
Apr 22 17:34:29.009954 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.009951 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15508h47m37.458748724s"
Apr 22 17:34:29.066584 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.066557 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:34:29.078978 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.078943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-script-lib\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.078992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-registration-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422
17:34:29.079025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-k8s-cni-cncf-io\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-multus-certs\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkpt\" (UniqueName: \"kubernetes.io/projected/de5c6256-7a62-4226-ba85-b1cfcfd4d404-kube-api-access-6dkpt\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-modprobe-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.079200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-serviceca\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.079200 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c07b1eb-ef24-4289-8216-10e5782c6173-konnectivity-ca\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079216 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-hosts-file\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnvx\" (UniqueName: \"kubernetes.io/projected/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-kube-api-access-vcnvx\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-modprobe-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-hosts-file\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-host\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-kubelet\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-host\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-config\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.079544 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-kubelet\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-conf\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-systemd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-sys-fs\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpqrp\" (UniqueName: \"kubernetes.io/projected/38d05363-ff3e-45c2-85a4-521bd61d2119-kube-api-access-rpqrp\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.079544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52gt6\" (UniqueName: \"kubernetes.io/projected/34c7625b-b71f-4d8d-a883-c465098dbba7-kube-api-access-52gt6\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-systemd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c07b1eb-ef24-4289-8216-10e5782c6173-konnectivity-ca\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-conf\") pod \"tuned-xjjrz\" (UID: 
\"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-script-lib\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-serviceca\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.079840 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-run\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-etc-tuned\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.079940 2572 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:29.579902122 +0000 UTC m=+3.059158807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-run\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxq5\" (UniqueName: \"kubernetes.io/projected/aedd63e3-6ddf-4202-adc9-e73988dd4d87-kube-api-access-twxq5\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-ovnkube-config\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.079990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-tmp\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-os-release\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-multus\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.080251 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-host-slash\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-var-lib-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080157 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-var-lib-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-system-cni-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080194 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cnibin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pms\" (UniqueName: \"kubernetes.io/projected/a719b6ff-4e34-4393-bec2-9239979501ec-kube-api-access-f5pms\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-device-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-os-release\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysconfig\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-kubelet\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-var-lib-kubelet\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-host\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysconfig\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-systemd-units\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-var-lib-kubelet\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-host\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-systemd-units\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-ovn\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080657 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-netns\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-ovn\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-run-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-daemon-config\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-iptables-alerter-script\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4n59\" (UniqueName: \"kubernetes.io/projected/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-kube-api-access-d4n59\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080825 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-etc-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-netd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-binary-copy\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.081777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-lib-modules\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-etc-openvswitch\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-node-log\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-env-overrides\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cnibin\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.080976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-netd\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-node-log\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-system-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-socket-dir-parent\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-netns\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-lib-modules\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c07b1eb-ef24-4289-8216-10e5782c6173-agent-certs\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-netns\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-socket-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-tmp-dir\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.082526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-etc-kubernetes\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a719b6ff-4e34-4393-bec2-9239979501ec-env-overrides\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-sysctl-d\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cni-binary-copy\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-bin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46wv\" (UniqueName: \"kubernetes.io/projected/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-kube-api-access-g46wv\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-kubernetes\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-tmp-dir\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a719b6ff-4e34-4393-bec2-9239979501ec-ovn-node-metrics-cert\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-kubernetes\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-conf-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-systemd\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5zj\" (UniqueName: \"kubernetes.io/projected/1faba3f3-2377-439b-94d0-db64e401cf51-kube-api-access-4p5zj\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-slash\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.083321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-etc-systemd\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-log-socket\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-log-socket\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-bin\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-cni-bin\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a719b6ff-4e34-4393-bec2-9239979501ec-host-slash\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-hostroot\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.081948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-sys\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.082027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1faba3f3-2377-439b-94d0-db64e401cf51-sys\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.084131 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.084078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-tmp\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.084581 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.084327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c07b1eb-ef24-4289-8216-10e5782c6173-agent-certs\") pod \"konnectivity-agent-8n7x2\" (UID: \"7c07b1eb-ef24-4289-8216-10e5782c6173\") " pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:29.084581 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.084443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1faba3f3-2377-439b-94d0-db64e401cf51-etc-tuned\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.085000 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.084975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a719b6ff-4e34-4393-bec2-9239979501ec-ovn-node-metrics-cert\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.094056 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.094033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnvx\" (UniqueName: \"kubernetes.io/projected/0e0b96d6-6ddc-4b33-8123-2eec86f21a66-kube-api-access-vcnvx\") pod \"node-resolver-4nmbk\" (UID: \"0e0b96d6-6ddc-4b33-8123-2eec86f21a66\") " pod="openshift-dns/node-resolver-4nmbk"
Apr 22 17:34:29.094056 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.094050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5zj\" (UniqueName: \"kubernetes.io/projected/1faba3f3-2377-439b-94d0-db64e401cf51-kube-api-access-4p5zj\") pod \"tuned-xjjrz\" (UID: \"1faba3f3-2377-439b-94d0-db64e401cf51\") " pod="openshift-cluster-node-tuning-operator/tuned-xjjrz"
Apr 22 17:34:29.094241 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.094222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4n59\" (UniqueName: \"kubernetes.io/projected/2f76cec7-7e9b-4a76-a5c5-c12f9790bb38-kube-api-access-d4n59\") pod \"node-ca-7q8mm\" (UID: \"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38\") " pod="openshift-image-registry/node-ca-7q8mm"
Apr 22 17:34:29.094304 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.094238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pms\" (UniqueName: \"kubernetes.io/projected/a719b6ff-4e34-4393-bec2-9239979501ec-kube-api-access-f5pms\") pod \"ovnkube-node-hrsxq\" (UID: \"a719b6ff-4e34-4393-bec2-9239979501ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
Apr 22 17:34:29.094613 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.094597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gt6\" (UniqueName: \"kubernetes.io/projected/34c7625b-b71f-4d8d-a883-c465098dbba7-kube-api-access-52gt6\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:29.183101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.183101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-system-cni-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cnibin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-device-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-os-release\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-system-cni-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-kubelet\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-netns\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-daemon-config\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-device-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cnibin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-iptables-alerter-script\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-kubelet\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-binary-copy\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-os-release\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cnibin\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-netns\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-system-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cnibin\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-socket-dir-parent\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-socket-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-system-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-etc-kubernetes\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-socket-dir-parent\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cni-binary-copy\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.183943 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-bin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g46wv\" (UniqueName: \"kubernetes.io/projected/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-kube-api-access-g46wv\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-conf-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-cni-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-hostroot\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-registration-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-bin\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-k8s-cni-cncf-io\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-socket-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-iptables-alerter-script\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-multus-certs\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkpt\" (UniqueName: \"kubernetes.io/projected/de5c6256-7a62-4226-ba85-b1cfcfd4d404-kube-api-access-6dkpt\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-conf-dir\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183997 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-sys-fs\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.184661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-hostroot\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-binary-copy\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-registration-dir\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-k8s-cni-cncf-io\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqrp\" (UniqueName: \"kubernetes.io/projected/38d05363-ff3e-45c2-85a4-521bd61d2119-kube-api-access-rpqrp\") pod \"aws-ebs-csi-driver-node-pg85n\" 
(UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.183943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-etc-kubernetes\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-run-multus-certs\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38d05363-ff3e-45c2-85a4-521bd61d2119-sys-fs\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxq5\" (UniqueName: 
\"kubernetes.io/projected/aedd63e3-6ddf-4202-adc9-e73988dd4d87-kube-api-access-twxq5\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-os-release\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-multus\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-host-slash\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-os-release\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/aedd63e3-6ddf-4202-adc9-e73988dd4d87-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de5c6256-7a62-4226-ba85-b1cfcfd4d404-host-var-lib-cni-multus\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-host-slash\") pod \"iptables-alerter-j4hck\" (UID: \"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.185479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-multus-daemon-config\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.186221 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.184910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de5c6256-7a62-4226-ba85-b1cfcfd4d404-cni-binary-copy\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.197346 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.197325 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:29.197346 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.197345 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:29.197504 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.197355 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:29.197504 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.197444 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:34:29.697425527 +0000 UTC m=+3.176682206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:29.200161 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.200138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkpt\" (UniqueName: \"kubernetes.io/projected/de5c6256-7a62-4226-ba85-b1cfcfd4d404-kube-api-access-6dkpt\") pod \"multus-q2vht\" (UID: \"de5c6256-7a62-4226-ba85-b1cfcfd4d404\") " pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.200273 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.200212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqrp\" (UniqueName: \"kubernetes.io/projected/38d05363-ff3e-45c2-85a4-521bd61d2119-kube-api-access-rpqrp\") pod \"aws-ebs-csi-driver-node-pg85n\" (UID: \"38d05363-ff3e-45c2-85a4-521bd61d2119\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.201235 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.201216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxq5\" (UniqueName: \"kubernetes.io/projected/aedd63e3-6ddf-4202-adc9-e73988dd4d87-kube-api-access-twxq5\") pod \"multus-additional-cni-plugins-5drmj\" (UID: \"aedd63e3-6ddf-4202-adc9-e73988dd4d87\") " pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.201729 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.201711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46wv\" (UniqueName: \"kubernetes.io/projected/b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a-kube-api-access-g46wv\") pod \"iptables-alerter-j4hck\" (UID: 
\"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a\") " pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.211501 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.211479 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:29.264336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.264299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8n7x2" Apr 22 17:34:29.272345 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.272310 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" Apr 22 17:34:29.280999 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.280976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4nmbk" Apr 22 17:34:29.287711 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.287669 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7q8mm" Apr 22 17:34:29.294243 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.294221 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:29.299837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.299814 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" Apr 22 17:34:29.307437 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.307414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5drmj" Apr 22 17:34:29.315143 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.315105 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-q2vht" Apr 22 17:34:29.320882 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.320850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j4hck" Apr 22 17:34:29.588142 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.588037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:29.588317 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.588212 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:29.588317 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.588292 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:30.588272355 +0000 UTC m=+4.067529037 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:29.721128 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.721102 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d05363_ff3e_45c2_85a4_521bd61d2119.slice/crio-2e1ef5e932c3dc5bff47565fa21783f565ceea61814babdbb64fdf6360a245ea WatchSource:0}: Error finding container 2e1ef5e932c3dc5bff47565fa21783f565ceea61814babdbb64fdf6360a245ea: Status 404 returned error can't find the container with id 2e1ef5e932c3dc5bff47565fa21783f565ceea61814babdbb64fdf6360a245ea Apr 22 17:34:29.722528 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.722504 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c07b1eb_ef24_4289_8216_10e5782c6173.slice/crio-b9a535aca0c946b5f5a7629f756101a17379198bfce893e7dd76c65f2d5789f6 WatchSource:0}: Error finding container b9a535aca0c946b5f5a7629f756101a17379198bfce893e7dd76c65f2d5789f6: Status 404 returned error can't find the container with id b9a535aca0c946b5f5a7629f756101a17379198bfce893e7dd76c65f2d5789f6 Apr 22 17:34:29.724202 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.724180 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedd63e3_6ddf_4202_adc9_e73988dd4d87.slice/crio-d7edb838461571d55023d9abf8bdde199afc6bd116afb356a55465734f9f10c5 WatchSource:0}: Error finding container d7edb838461571d55023d9abf8bdde199afc6bd116afb356a55465734f9f10c5: Status 404 returned error can't find the container with id d7edb838461571d55023d9abf8bdde199afc6bd116afb356a55465734f9f10c5 Apr 22 17:34:29.726365 
ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.726264 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f76cec7_7e9b_4a76_a5c5_c12f9790bb38.slice/crio-898fc28c5d480678c526a96b8f9207dc7b46c66af8854ed7432d9cbf18f69306 WatchSource:0}: Error finding container 898fc28c5d480678c526a96b8f9207dc7b46c66af8854ed7432d9cbf18f69306: Status 404 returned error can't find the container with id 898fc28c5d480678c526a96b8f9207dc7b46c66af8854ed7432d9cbf18f69306 Apr 22 17:34:29.727131 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.727111 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda719b6ff_4e34_4393_bec2_9239979501ec.slice/crio-5303746bb375840930d26c4906ffde73570ca9be2d3f13f25bb11ae48f4fd917 WatchSource:0}: Error finding container 5303746bb375840930d26c4906ffde73570ca9be2d3f13f25bb11ae48f4fd917: Status 404 returned error can't find the container with id 5303746bb375840930d26c4906ffde73570ca9be2d3f13f25bb11ae48f4fd917 Apr 22 17:34:29.728181 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.728159 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1faba3f3_2377_439b_94d0_db64e401cf51.slice/crio-1300acfb1097c4b2a6d931545f0227a69f5c7d6e318748a578d71c2f709b10e8 WatchSource:0}: Error finding container 1300acfb1097c4b2a6d931545f0227a69f5c7d6e318748a578d71c2f709b10e8: Status 404 returned error can't find the container with id 1300acfb1097c4b2a6d931545f0227a69f5c7d6e318748a578d71c2f709b10e8 Apr 22 17:34:29.729251 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.729183 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93ed3f0_756c_4b7f_ac31_3ef6d1bef01a.slice/crio-d302703b2840302289cf1c45b9380c7828f703116e763c0dfd972fac860c965f WatchSource:0}: Error 
finding container d302703b2840302289cf1c45b9380c7828f703116e763c0dfd972fac860c965f: Status 404 returned error can't find the container with id d302703b2840302289cf1c45b9380c7828f703116e763c0dfd972fac860c965f Apr 22 17:34:29.730373 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.730350 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5c6256_7a62_4226_ba85_b1cfcfd4d404.slice/crio-a3c00816bc6e254ac168c7c7c25028564ac6df06e2cff99600850c2150d401be WatchSource:0}: Error finding container a3c00816bc6e254ac168c7c7c25028564ac6df06e2cff99600850c2150d401be: Status 404 returned error can't find the container with id a3c00816bc6e254ac168c7c7c25028564ac6df06e2cff99600850c2150d401be Apr 22 17:34:29.732198 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:34:29.732175 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0b96d6_6ddc_4b33_8123_2eec86f21a66.slice/crio-1035659b56e72603a3a5ded44c3db94a9ae359793927807e073455e990cda43d WatchSource:0}: Error finding container 1035659b56e72603a3a5ded44c3db94a9ae359793927807e073455e990cda43d: Status 404 returned error can't find the container with id 1035659b56e72603a3a5ded44c3db94a9ae359793927807e073455e990cda43d Apr 22 17:34:29.789385 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:29.789359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:29.789507 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.789484 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 22 17:34:29.789507 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.789498 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:29.789507 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.789507 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:29.789603 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:29.789555 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:34:30.789541306 +0000 UTC m=+4.268797983 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:30.010642 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.010340 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:28 +0000 UTC" deadline="2027-10-01 13:59:30.062822677 +0000 UTC" Apr 22 17:34:30.010642 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.010533 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12644h25m0.052292765s" Apr 22 17:34:30.094247 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.094160 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:30.094407 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.094285 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:30.102435 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.102392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4nmbk" event={"ID":"0e0b96d6-6ddc-4b33-8123-2eec86f21a66","Type":"ContainerStarted","Data":"1035659b56e72603a3a5ded44c3db94a9ae359793927807e073455e990cda43d"} Apr 22 17:34:30.104293 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.104264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q2vht" event={"ID":"de5c6256-7a62-4226-ba85-b1cfcfd4d404","Type":"ContainerStarted","Data":"a3c00816bc6e254ac168c7c7c25028564ac6df06e2cff99600850c2150d401be"} Apr 22 17:34:30.106237 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.106209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j4hck" event={"ID":"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a","Type":"ContainerStarted","Data":"d302703b2840302289cf1c45b9380c7828f703116e763c0dfd972fac860c965f"} Apr 22 17:34:30.108245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.108065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" event={"ID":"1faba3f3-2377-439b-94d0-db64e401cf51","Type":"ContainerStarted","Data":"1300acfb1097c4b2a6d931545f0227a69f5c7d6e318748a578d71c2f709b10e8"} Apr 22 17:34:30.110723 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.110678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7q8mm" event={"ID":"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38","Type":"ContainerStarted","Data":"898fc28c5d480678c526a96b8f9207dc7b46c66af8854ed7432d9cbf18f69306"} Apr 22 17:34:30.112377 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.112348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" 
event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerStarted","Data":"d7edb838461571d55023d9abf8bdde199afc6bd116afb356a55465734f9f10c5"}
Apr 22 17:34:30.114927 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.114886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8n7x2" event={"ID":"7c07b1eb-ef24-4289-8216-10e5782c6173","Type":"ContainerStarted","Data":"b9a535aca0c946b5f5a7629f756101a17379198bfce893e7dd76c65f2d5789f6"}
Apr 22 17:34:30.118127 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.118106 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" event={"ID":"ec7cb24e845e46d881c2a9f07f361da0","Type":"ContainerStarted","Data":"1720f9e1a547fb5533e78715c01df2b1e49dda46e1a288515eda265a93f43b6e"}
Apr 22 17:34:30.119799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.119766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"5303746bb375840930d26c4906ffde73570ca9be2d3f13f25bb11ae48f4fd917"}
Apr 22 17:34:30.121784 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.121757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" event={"ID":"38d05363-ff3e-45c2-85a4-521bd61d2119","Type":"ContainerStarted","Data":"2e1ef5e932c3dc5bff47565fa21783f565ceea61814babdbb64fdf6360a245ea"}
Apr 22 17:34:30.596467 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.596426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:30.596629 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.596583 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:30.596679 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.596648 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:32.596628616 +0000 UTC m=+6.075885282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:30.798355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:30.798184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:30.798517 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.798360 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:30.798517 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.798383 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:30.798517 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.798394 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:30.798517 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:30.798447 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:34:32.79843173 +0000 UTC m=+6.277688390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:31.094679 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:31.094091 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:31.094679 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:31.094232 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:31.165633 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:31.164852 2572 generic.go:358] "Generic (PLEG): container finished" podID="b3e046f13e45ad604d042daea0e97d3a" containerID="407854acbf25acacf849e58800d9eaaedc9d84d7bede9d57bbbe6184e77bc796" exitCode=0
Apr 22 17:34:31.165633 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:31.165087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" event={"ID":"b3e046f13e45ad604d042daea0e97d3a","Type":"ContainerDied","Data":"407854acbf25acacf849e58800d9eaaedc9d84d7bede9d57bbbe6184e77bc796"}
Apr 22 17:34:31.197414 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:31.196344 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-36.ec2.internal" podStartSLOduration=3.1963234209999998 podStartE2EDuration="3.196323421s" podCreationTimestamp="2026-04-22 17:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:30.139552661 +0000 UTC m=+3.618809344" watchObservedRunningTime="2026-04-22 17:34:31.196323421 +0000 UTC m=+4.675580103"
Apr 22 17:34:32.093727 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:32.093675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:32.093904 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.093835 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:32.187032 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:32.186994 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" event={"ID":"b3e046f13e45ad604d042daea0e97d3a","Type":"ContainerStarted","Data":"532a824b6d0ca86f86115594c57104246ccb9ce3099abd3e3914365c5a128e68"}
Apr 22 17:34:32.610896 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:32.610853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:32.611095 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.611013 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:32.611095 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.611076 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:36.611056715 +0000 UTC m=+10.090313381 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:32.812062 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:32.812023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:32.812236 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.812223 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:32.812296 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.812249 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:32.812296 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.812263 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:32.812406 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:32.812330 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:34:36.812310669 +0000 UTC m=+10.291567344 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:33.094846 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:33.094765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:33.094993 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:33.094920 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:34.094109 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:34.093911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:34.094109 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:34.094042 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:35.094777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:35.094740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:35.095169 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:35.094884 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:36.094667 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:36.094628 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:36.094871 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.094787 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:36.639796 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:36.639743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:36.639997 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.639901 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:36.639997 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.639987 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.639964338 +0000 UTC m=+18.119221011 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:36.841070 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:36.841029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:36.841247 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.841219 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:36.841247 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.841244 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:36.841355 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.841257 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:36.841355 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:36.841319 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.841304445 +0000 UTC m=+18.320561107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:37.095405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:37.095305 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:37.095852 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:37.095429 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:38.094510 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:38.094449 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:38.094723 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:38.094581 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:39.094852 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.094819 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:39.095318 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:39.094959 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:39.203284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.203243 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" event={"ID":"1faba3f3-2377-439b-94d0-db64e401cf51","Type":"ContainerStarted","Data":"454c61888ebcfd97628222fe1190ea9147294f3cfccc419852229b16097de398"}
Apr 22 17:34:39.204958 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.204885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7q8mm" event={"ID":"2f76cec7-7e9b-4a76-a5c5-c12f9790bb38","Type":"ContainerStarted","Data":"a8527a2ccb2d37a3e7173228beebe5d4b8eac0cecffef0feb492522417007710"}
Apr 22 17:34:39.206299 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.206276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerStarted","Data":"9a8e2fcbd56d08f7dc2a5dc288a25023c380e08b34316e768b55bf35c845d78c"}
Apr 22 17:34:39.207691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.207665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8n7x2"
event={"ID":"7c07b1eb-ef24-4289-8216-10e5782c6173","Type":"ContainerStarted","Data":"5509a3409bcac9ff8dac3d2ce2643e359ce679dee1ec2868772a0fd62e04dcfc"}
Apr 22 17:34:39.209214 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.209189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" event={"ID":"38d05363-ff3e-45c2-85a4-521bd61d2119","Type":"ContainerStarted","Data":"53bd3b1a4be9866a9e0cc51c9e87b80745c1eaec08c1c6a134a83bb95abd68cb"}
Apr 22 17:34:39.210465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.210437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4nmbk" event={"ID":"0e0b96d6-6ddc-4b33-8123-2eec86f21a66","Type":"ContainerStarted","Data":"32d29832ce110693f270344f73e3516194c732772b2cf645ad0b4413836e5316"}
Apr 22 17:34:39.223378 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.223326 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xjjrz" podStartSLOduration=3.413647287 podStartE2EDuration="12.223312322s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.730247729 +0000 UTC m=+3.209504389" lastFinishedPulling="2026-04-22 17:34:38.539912759 +0000 UTC m=+12.019169424" observedRunningTime="2026-04-22 17:34:39.223152935 +0000 UTC m=+12.702409617" watchObservedRunningTime="2026-04-22 17:34:39.223312322 +0000 UTC m=+12.702569005"
Apr 22 17:34:39.223782 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.223744 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-36.ec2.internal" podStartSLOduration=11.223731281 podStartE2EDuration="11.223731281s" podCreationTimestamp="2026-04-22 17:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:32.209349881 +0000 UTC m=+5.688606569" watchObservedRunningTime="2026-04-22 17:34:39.223731281 +0000 UTC m=+12.702987965"
Apr 22 17:34:39.262108 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.262046 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8n7x2" podStartSLOduration=3.461663097 podStartE2EDuration="12.262029385s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.724890774 +0000 UTC m=+3.204147435" lastFinishedPulling="2026-04-22 17:34:38.525257057 +0000 UTC m=+12.004513723" observedRunningTime="2026-04-22 17:34:39.261042592 +0000 UTC m=+12.740299275" watchObservedRunningTime="2026-04-22 17:34:39.262029385 +0000 UTC m=+12.741286068"
Apr 22 17:34:39.278449 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.278340 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4nmbk" podStartSLOduration=3.487830537 podStartE2EDuration="12.278316129s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.734785366 +0000 UTC m=+3.214042026" lastFinishedPulling="2026-04-22 17:34:38.525270941 +0000 UTC m=+12.004527618" observedRunningTime="2026-04-22 17:34:39.277283908 +0000 UTC m=+12.756540594" watchObservedRunningTime="2026-04-22 17:34:39.278316129 +0000 UTC m=+12.757572812"
Apr 22 17:34:39.293117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.293073 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7q8mm" podStartSLOduration=3.4966777860000002 podStartE2EDuration="12.293055673s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.728046551 +0000 UTC m=+3.207303216" lastFinishedPulling="2026-04-22 17:34:38.524424433 +0000 UTC m=+12.003681103" observedRunningTime="2026-04-22 17:34:39.292460337 +0000 UTC m=+12.771717020" watchObservedRunningTime="2026-04-22 17:34:39.293055673 +0000 UTC m=+12.772312357"
Apr 22 17:34:39.593386 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.593274 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:39.594548 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:39.594524 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:40.094171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:40.093947 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:40.094316 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:40.094288 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:40.213984 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:40.213940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j4hck" event={"ID":"b93ed3f0-756c-4b7f-ac31-3ef6d1bef01a","Type":"ContainerStarted","Data":"e473371a1cd6fa60c2863dbe68ceee15a9740e0860d0cf2e4812fc444d6e2bce"}
Apr 22 17:34:40.214907 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:40.214877 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:40.215345 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:40.215326 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8n7x2"
Apr 22 17:34:40.254435 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:40.254373 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j4hck" podStartSLOduration=4.46271022 podStartE2EDuration="13.2543528s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.7336276 +0000 UTC m=+3.212884273" lastFinishedPulling="2026-04-22 17:34:38.525270178 +0000 UTC m=+12.004526853" observedRunningTime="2026-04-22 17:34:40.234574324 +0000 UTC m=+13.713831009" watchObservedRunningTime="2026-04-22 17:34:40.2543528 +0000 UTC m=+13.733609530"
Apr 22 17:34:41.094347 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:41.094319 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:41.094548 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:41.094456 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:42.094017 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:42.093974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:42.094510 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:42.094123 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:43.093797 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:43.093761 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:43.093976 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:43.093875 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:44.093938 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:44.093905 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:44.094576 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.094024 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:44.222233 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:44.222196 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="9a8e2fcbd56d08f7dc2a5dc288a25023c380e08b34316e768b55bf35c845d78c" exitCode=0
Apr 22 17:34:44.222233 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:44.222246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"9a8e2fcbd56d08f7dc2a5dc288a25023c380e08b34316e768b55bf35c845d78c"}
Apr 22 17:34:44.701971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:44.701934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:44.702143 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.702057 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:44.702143 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.702130 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.702107643 +0000 UTC m=+34.181364307 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:44.903513 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:44.903477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:44.903686 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.903633 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:44.903686 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.903652 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:44.903686 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.903662 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:44.903875 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:44.903740 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.903722222 +0000 UTC m=+34.382978887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:45.094375 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:45.094284 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:45.094898 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:45.094419 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:46.094115 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:46.094078 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:46.094300 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:46.094186 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:47.095565 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:47.095505 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:34:47.096134 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:47.095643 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:34:47.806503 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:47.806276 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 17:34:48.050633 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.050198 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:34:47.806498784Z","UUID":"2cf3f439-6951-423e-aaef-c9d4455256a5","Handler":null,"Name":"","Endpoint":""}
Apr 22 17:34:48.053290 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.053267 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 17:34:48.053377 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.053297 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 17:34:48.093795 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.093765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p"
Apr 22 17:34:48.093975 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:48.093880 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f"
Apr 22 17:34:48.231429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.231320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q2vht" event={"ID":"de5c6256-7a62-4226-ba85-b1cfcfd4d404","Type":"ContainerStarted","Data":"147f8abe88e0c878bc48b5b0a404adebf40f9938bf3bacc98194af225b428d51"}
Apr 22 17:34:48.235692 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"41516747008a23c10e914363e7f7676eacd208e59e5807e6e3b5a0c80d962166"}
Apr 22 17:34:48.235692 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"bb9edd2b19db53134823c95e34a98d5292addba3b68655a98dcb75561331429c"}
Apr 22 17:34:48.235920 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"3a206bed5598e1aea64ffb883cb68b097ec680e8c4dba2691d20085ae0be4f71"}
Apr 22 17:34:48.235920 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"d2a58e1d4d32d4dcb061298237e980855eabc122c14628f529628f0fc29b9396"}
Apr 22 17:34:48.235920 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq"
event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"756efb7c8528d9aa155ca698bd97b52c4022df3fe3abd1cec1272e8b9280d20d"} Apr 22 17:34:48.235920 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.235797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"ae80da7a4469d5e1d7eb94a6b5a11296d66357a9cb8378c70f82ae9fc65045e4"} Apr 22 17:34:48.237883 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.237856 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" event={"ID":"38d05363-ff3e-45c2-85a4-521bd61d2119","Type":"ContainerStarted","Data":"0458d47ce7d31fa4793957e7dbd9afd7c43cba22d6f4c3888d6e9937843e007c"} Apr 22 17:34:48.251027 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:48.250978 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q2vht" podStartSLOduration=3.609053003 podStartE2EDuration="21.250959429s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.734298848 +0000 UTC m=+3.213555511" lastFinishedPulling="2026-04-22 17:34:47.376205271 +0000 UTC m=+20.855461937" observedRunningTime="2026-04-22 17:34:48.250478689 +0000 UTC m=+21.729735372" watchObservedRunningTime="2026-04-22 17:34:48.250959429 +0000 UTC m=+21.730216112" Apr 22 17:34:49.094168 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:49.094130 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:49.094339 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:49.094256 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:34:50.094228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:50.094194 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:50.094673 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:50.094311 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:50.246747 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:50.246712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" event={"ID":"38d05363-ff3e-45c2-85a4-521bd61d2119","Type":"ContainerStarted","Data":"e176bec96af064121f13dc6d227f02ebc00f178502d90929a2fa6c8d80656dc1"} Apr 22 17:34:51.094760 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:51.094659 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:51.095239 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:51.094808 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:34:51.251141 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:51.251110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"1196b881cabf92890b65eaf08829404928c7911bf0e51cb222474a0d58a18d5a"} Apr 22 17:34:52.094521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:52.094484 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:52.094690 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:52.094592 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:52.254301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:52.254215 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="0b11ab63aaea2ea479e914107ac880b0107b088a85a02d565ae0f25b756c5a44" exitCode=0 Apr 22 17:34:52.254301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:52.254272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"0b11ab63aaea2ea479e914107ac880b0107b088a85a02d565ae0f25b756c5a44"} Apr 22 17:34:52.276017 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:52.275975 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg85n" podStartSLOduration=5.930281056 podStartE2EDuration="25.275960788s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.722864943 +0000 UTC m=+3.202121617" lastFinishedPulling="2026-04-22 17:34:49.068544675 +0000 UTC m=+22.547801349" observedRunningTime="2026-04-22 17:34:50.266397653 +0000 UTC m=+23.745654347" watchObservedRunningTime="2026-04-22 17:34:52.275960788 +0000 UTC m=+25.755217538" Apr 22 17:34:53.094196 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.094012 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:53.094356 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:53.094310 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:34:53.260402 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.260265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" event={"ID":"a719b6ff-4e34-4393-bec2-9239979501ec","Type":"ContainerStarted","Data":"5b259a2530b206a1368ffdd2f939ac36a83eba571d0e1e2f762a56c196119e97"} Apr 22 17:34:53.261086 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.260500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:53.261086 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.260529 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:53.262497 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.262469 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="e0057d96d707d4de9fde2c24ba64b80036ee022bcdda4287aebc11bd5e44d752" exitCode=0 Apr 22 17:34:53.262620 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.262503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"e0057d96d707d4de9fde2c24ba64b80036ee022bcdda4287aebc11bd5e44d752"} Apr 22 17:34:53.276211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.276185 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:53.308735 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:53.308670 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" podStartSLOduration=8.661346216 podStartE2EDuration="26.308656346s" podCreationTimestamp="2026-04-22 17:34:27 +0000 
UTC" firstStartedPulling="2026-04-22 17:34:29.729160463 +0000 UTC m=+3.208417140" lastFinishedPulling="2026-04-22 17:34:47.376470596 +0000 UTC m=+20.855727270" observedRunningTime="2026-04-22 17:34:53.30835061 +0000 UTC m=+26.787607291" watchObservedRunningTime="2026-04-22 17:34:53.308656346 +0000 UTC m=+26.787913028" Apr 22 17:34:54.094776 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.094734 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:54.094970 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:54.094875 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:54.266818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.266726 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="adca3f675bdb923b345b5a2e05bac4e1bf7ace7d4218f0a31923f52d13af194c" exitCode=0 Apr 22 17:34:54.266818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.266797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"adca3f675bdb923b345b5a2e05bac4e1bf7ace7d4218f0a31923f52d13af194c"} Apr 22 17:34:54.267460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.267441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:54.281923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.281898 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:34:54.846190 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.846157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-djttm"] Apr 22 17:34:54.846418 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.846312 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:54.846480 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:54.846431 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:34:54.864327 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.864243 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g829p"] Apr 22 17:34:54.864470 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:54.864354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:54.864470 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:54.864441 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:56.094325 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:56.094280 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:56.094909 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:56.094429 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:57.095405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:57.095369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:57.095855 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:57.095501 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:34:58.094451 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:58.094418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:34:58.094619 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:58.094524 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:34:59.094276 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:34:59.094075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:34:59.094796 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:34:59.094388 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7" Apr 22 17:35:00.094398 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.094372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:00.094768 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.094475 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g829p" podUID="27c6ba53-cee0-478e-afff-6fec5a07bc6f" Apr 22 17:35:00.281297 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.281224 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="c5e56b29e979061bf959e5758eff7fe19bd24a4d11c8cd76c4a50c0f2fd00d32" exitCode=0 Apr 22 17:35:00.281297 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.281270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"c5e56b29e979061bf959e5758eff7fe19bd24a4d11c8cd76c4a50c0f2fd00d32"} Apr 22 17:35:00.720815 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.720721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:35:00.720979 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.720814 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:00.720979 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.720894 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:32.720873999 +0000 UTC m=+66.200130678 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:00.857792 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.857760 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-36.ec2.internal" event="NodeReady" Apr 22 17:35:00.857930 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.857870 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:35:00.913446 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.913417 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m45f5"] Apr 22 17:35:00.916278 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.916261 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-66csc"] Apr 22 17:35:00.916430 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.916413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:00.918809 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.918784 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:00.918966 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.918865 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\"" Apr 22 17:35:00.918966 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.918879 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:35:00.918966 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.918865 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:35:00.921338 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.921321 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:35:00.921392 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.921366 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\"" Apr 22 17:35:00.921732 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.921716 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:35:00.921788 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.921765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:35:00.922256 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.922238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 
17:35:00.922415 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.922400 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:35:00.922470 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.922419 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:35:00.922470 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.922428 2572 projected.go:194] Error preparing data for projected volume kube-api-access-tnnsn for pod openshift-network-diagnostics/network-check-target-g829p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:00.922739 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:00.922474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn podName:27c6ba53-cee0-478e-afff-6fec5a07bc6f nodeName:}" failed. No retries permitted until 2026-04-22 17:35:32.922461927 +0000 UTC m=+66.401718590 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tnnsn" (UniqueName: "kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn") pod "network-check-target-g829p" (UID: "27c6ba53-cee0-478e-afff-6fec5a07bc6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:00.929142 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.929119 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m45f5"] Apr 22 17:35:00.944647 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:00.944624 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66csc"] Apr 22 17:35:01.023317 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-tmp-dir\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.023317 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqqf\" (UniqueName: \"kubernetes.io/projected/7c3b2116-a369-4692-8afd-099a5a5a39cd-kube-api-access-ctqqf\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.023317 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-config-volume\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " 
pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.023539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.023539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lvxz\" (UniqueName: \"kubernetes.io/projected/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-kube-api-access-9lvxz\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.023539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.023473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.093801 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.093761 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:35:01.096500 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.096480 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\"" Apr 22 17:35:01.096950 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.096518 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:35:01.123968 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.123937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.124117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.123978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-tmp-dir\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.124117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqqf\" (UniqueName: \"kubernetes.io/projected/7c3b2116-a369-4692-8afd-099a5a5a39cd-kube-api-access-ctqqf\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.124117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-config-volume\") pod \"dns-default-m45f5\" (UID: 
\"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.124117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.124293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.124122 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:01.124293 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lvxz\" (UniqueName: \"kubernetes.io/projected/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-kube-api-access-9lvxz\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.124293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.124172 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:01.624157983 +0000 UTC m=+35.103414663 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:01.124293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.124248 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:01.124483 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.124309 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:35:01.62429134 +0000 UTC m=+35.103548007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:01.124483 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-tmp-dir\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.124580 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.124564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-config-volume\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.135066 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.135044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9lvxz\" (UniqueName: \"kubernetes.io/projected/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-kube-api-access-9lvxz\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.135171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.135095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqqf\" (UniqueName: \"kubernetes.io/projected/7c3b2116-a369-4692-8afd-099a5a5a39cd-kube-api-access-ctqqf\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.288245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.288157 2572 generic.go:358] "Generic (PLEG): container finished" podID="aedd63e3-6ddf-4202-adc9-e73988dd4d87" containerID="b1e14646a4a5424f9aed45cc109221d9288fea6297c5a5d2de2f94321ec936d3" exitCode=0 Apr 22 17:35:01.288245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.288231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerDied","Data":"b1e14646a4a5424f9aed45cc109221d9288fea6297c5a5d2de2f94321ec936d3"} Apr 22 17:35:01.628171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.628075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:01.628171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:01.628134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " 
pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:01.628362 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.628212 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:01.628362 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.628214 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:01.628362 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.628263 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:02.628248941 +0000 UTC m=+36.107505601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:01.628362 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:01.628275 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:35:02.628269771 +0000 UTC m=+36.107526431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:02.094308 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.094224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:02.097117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.097099 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:35:02.097918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.097899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:35:02.097969 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.097918 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lgj26\"" Apr 22 17:35:02.293321 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.293287 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5drmj" event={"ID":"aedd63e3-6ddf-4202-adc9-e73988dd4d87","Type":"ContainerStarted","Data":"ac8907ffe3ebce93e0ee242b3ddd7143b6f4234984e0f97f810f39854b929f77"} Apr 22 17:35:02.315741 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.315677 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5drmj" podStartSLOduration=5.027349661 podStartE2EDuration="35.315662841s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:29.725994943 +0000 UTC m=+3.205251603" lastFinishedPulling="2026-04-22 17:35:00.014308106 +0000 UTC m=+33.493564783" observedRunningTime="2026-04-22 17:35:02.31545908 +0000 UTC m=+35.794715762" watchObservedRunningTime="2026-04-22 17:35:02.315662841 +0000 UTC m=+35.794919523" Apr 22 17:35:02.634338 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.634301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:02.634501 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:02.634365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:02.634501 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:02.634446 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:02.634579 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:02.634515 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:04.634499459 +0000 UTC m=+38.113756132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:02.634579 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:02.634448 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:02.634657 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:02.634583 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:04.634569812 +0000 UTC m=+38.113826471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:04.649001 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:04.648970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:04.649345 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:04.649020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:04.649345 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:04.649108 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:04.649345 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:04.649113 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:04.649345 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:04.649162 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:08.649146978 +0000 UTC m=+42.128403638 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:04.649345 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:04.649176 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:35:08.649169799 +0000 UTC m=+42.128426458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:08.680007 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:08.679971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:08.680007 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:08.680024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:08.680420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:08.680113 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:08.680420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:08.680126 2572 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:08.680420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:08.680164 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.680151325 +0000 UTC m=+50.159407984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:08.680420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:08.680178 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.680171965 +0000 UTC m=+50.159428625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:16.737052 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:16.736849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:16.737540 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:16.737084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:16.737540 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:16.736995 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:16.737540 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:16.737166 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:35:32.737150658 +0000 UTC m=+66.216407318 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:16.737540 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:16.737206 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:16.737540 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:16.737263 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:35:32.737248232 +0000 UTC m=+66.216504909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:26.281481 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:26.281451 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hrsxq" Apr 22 17:35:32.748248 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.748211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:35:32.748248 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.748251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: 
\"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:35:32.748891 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.748275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:35:32.748891 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.748356 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:32.748891 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.748357 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:32.748891 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.748413 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:36:04.748398342 +0000 UTC m=+98.227655016 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:35:32.748891 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.748425 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. No retries permitted until 2026-04-22 17:36:04.748419927 +0000 UTC m=+98.227676587 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:35:32.750867 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.750850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:35:32.758592 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.758577 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:35:32.758669 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:35:32.758622 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:36.75860982 +0000 UTC m=+130.237866484 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : secret "metrics-daemon-secret" not found Apr 22 17:35:32.950797 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.950758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:32.953810 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.953765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:35:32.963808 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.963787 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:35:32.974602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:32.974581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnsn\" (UniqueName: \"kubernetes.io/projected/27c6ba53-cee0-478e-afff-6fec5a07bc6f-kube-api-access-tnnsn\") pod \"network-check-target-g829p\" (UID: \"27c6ba53-cee0-478e-afff-6fec5a07bc6f\") " pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:33.004895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:33.004825 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lgj26\"" Apr 22 17:35:33.012988 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:33.012967 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:33.200672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:33.200641 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g829p"] Apr 22 17:35:33.211005 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:35:33.210977 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27c6ba53_cee0_478e_afff_6fec5a07bc6f.slice/crio-b7a0dd7746a158c417c6574caf55765a4468c881fcbd7a6e6f33f58de07ac0ce WatchSource:0}: Error finding container b7a0dd7746a158c417c6574caf55765a4468c881fcbd7a6e6f33f58de07ac0ce: Status 404 returned error can't find the container with id b7a0dd7746a158c417c6574caf55765a4468c881fcbd7a6e6f33f58de07ac0ce Apr 22 17:35:33.352878 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:33.352795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g829p" event={"ID":"27c6ba53-cee0-478e-afff-6fec5a07bc6f","Type":"ContainerStarted","Data":"b7a0dd7746a158c417c6574caf55765a4468c881fcbd7a6e6f33f58de07ac0ce"} Apr 22 17:35:36.362211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:36.362161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g829p" event={"ID":"27c6ba53-cee0-478e-afff-6fec5a07bc6f","Type":"ContainerStarted","Data":"6e92f81b401ba27a7a98dff84a6ae83d6d4bd405799e56e8717f0ba970d1c9e0"} Apr 22 17:35:36.362614 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:36.362423 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:35:36.379929 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:35:36.379883 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-g829p" 
podStartSLOduration=66.7233619 podStartE2EDuration="1m9.379868956s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:35:33.212812481 +0000 UTC m=+66.692069142" lastFinishedPulling="2026-04-22 17:35:35.869319538 +0000 UTC m=+69.348576198" observedRunningTime="2026-04-22 17:35:36.378994427 +0000 UTC m=+69.858251108" watchObservedRunningTime="2026-04-22 17:35:36.379868956 +0000 UTC m=+69.859125637" Apr 22 17:36:04.774448 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:04.774410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc" Apr 22 17:36:04.774893 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:04.774466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5" Apr 22 17:36:04.774893 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:04.774544 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:36:04.774893 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:04.774547 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:36:04.774893 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:04.774600 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls podName:8da954ec-226e-4e6e-a43d-0ef4bc182e6c nodeName:}" failed. 
No retries permitted until 2026-04-22 17:37:08.774585298 +0000 UTC m=+162.253841959 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls") pod "dns-default-m45f5" (UID: "8da954ec-226e-4e6e-a43d-0ef4bc182e6c") : secret "dns-default-metrics-tls" not found Apr 22 17:36:04.774893 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:04.774615 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert podName:7c3b2116-a369-4692-8afd-099a5a5a39cd nodeName:}" failed. No retries permitted until 2026-04-22 17:37:08.774607746 +0000 UTC m=+162.253864406 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert") pod "ingress-canary-66csc" (UID: "7c3b2116-a369-4692-8afd-099a5a5a39cd") : secret "canary-serving-cert" not found Apr 22 17:36:07.366798 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:07.366763 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-g829p" Apr 22 17:36:23.193936 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.193900 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"] Apr 22 17:36:23.196857 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.196833 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" Apr 22 17:36:23.198972 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.198950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 17:36:23.199078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.199004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:36:23.199078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.199042 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:36:23.200800 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.200781 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-k5749\"" Apr 22 17:36:23.201046 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.201033 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 17:36:23.206692 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.206673 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"] Apr 22 17:36:23.299960 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.299929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k6554"] Apr 22 17:36:23.302721 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.302687 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k6554" Apr 22 17:36:23.305078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:36:23.305211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305089 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:36:23.305211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305125 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 17:36:23.305211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305152 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 17:36:23.305462 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" Apr 22 17:36:23.305462 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305449 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-ss2tf\"" Apr 22 17:36:23.305576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7676326c-368f-406c-a545-55cebded1f2b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" 
(UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.305576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.305561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7j6\" (UniqueName: \"kubernetes.io/projected/7676326c-368f-406c-a545-55cebded1f2b-kube-api-access-4f7j6\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.310942 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.310923 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 17:36:23.312301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.312280 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k6554"]
Apr 22 17:36:23.406738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.406681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.406911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.406766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7676326c-368f-406c-a545-55cebded1f2b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.406911 ip-10-0-135-36
kubenswrapper[2572]: I0422 17:36:23.406800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxwgz\" (UniqueName: \"kubernetes.io/projected/611abd7b-ca37-43e2-a480-f0fc4b1aa306-kube-api-access-pxwgz\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.406911 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:23.406835 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:23.406911 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:23.406905 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:23.906888871 +0000 UTC m=+117.386145536 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:23.407068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.406935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-tmp\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.407068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.406990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7j6\" (UniqueName: \"kubernetes.io/projected/7676326c-368f-406c-a545-55cebded1f2b-kube-api-access-4f7j6\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.407068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.407030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611abd7b-ca37-43e2-a480-f0fc4b1aa306-serving-cert\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.407182 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.407084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-service-ca-bundle\") pod
\"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.407182 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.407149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-snapshots\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.407256 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.407186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.407488 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.407470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7676326c-368f-406c-a545-55cebded1f2b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.415036 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.415008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7j6\" (UniqueName: \"kubernetes.io/projected/7676326c-368f-406c-a545-55cebded1f2b-kube-api-access-4f7j6\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.508102 ip-10-0-135-36 kubenswrapper[2572]:
I0422 17:36:23.508011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-service-ca-bundle\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508102 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-snapshots\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508102 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508376 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwgz\" (UniqueName: \"kubernetes.io/projected/611abd7b-ca37-43e2-a480-f0fc4b1aa306-kube-api-access-pxwgz\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508376 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-tmp\") pod \"insights-operator-585dfdc468-k6554\" (UID:
\"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508376 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611abd7b-ca37-43e2-a480-f0fc4b1aa306-serving-cert\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-service-ca-bundle\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-snapshots\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/611abd7b-ca37-43e2-a480-f0fc4b1aa306-tmp\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.508925 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.508907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/611abd7b-ca37-43e2-a480-f0fc4b1aa306-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.510610 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.510591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611abd7b-ca37-43e2-a480-f0fc4b1aa306-serving-cert\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.516435 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.516414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwgz\" (UniqueName: \"kubernetes.io/projected/611abd7b-ca37-43e2-a480-f0fc4b1aa306-kube-api-access-pxwgz\") pod \"insights-operator-585dfdc468-k6554\" (UID: \"611abd7b-ca37-43e2-a480-f0fc4b1aa306\") " pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.612751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.612713 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k6554"
Apr 22 17:36:23.742853 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.742821 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k6554"]
Apr 22 17:36:23.746065 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:23.746039 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611abd7b_ca37_43e2_a480_f0fc4b1aa306.slice/crio-b9ac500a56b7318170444c32232b47ee8659266c1193eca138831a0090a0c7dd WatchSource:0}: Error finding container b9ac500a56b7318170444c32232b47ee8659266c1193eca138831a0090a0c7dd: Status 404 returned error can't find the container with id b9ac500a56b7318170444c32232b47ee8659266c1193eca138831a0090a0c7dd
Apr 22 17:36:23.911277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:23.911239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:23.911433 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:23.911391 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:23.911476 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:23.911471 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:24.911454323 +0000 UTC m=+118.390710982 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:24.459407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:24.459368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k6554" event={"ID":"611abd7b-ca37-43e2-a480-f0fc4b1aa306","Type":"ContainerStarted","Data":"b9ac500a56b7318170444c32232b47ee8659266c1193eca138831a0090a0c7dd"}
Apr 22 17:36:24.918529 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:24.918491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:24.918756 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:24.918660 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:24.918838 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:24.918760 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:26.918734334 +0000 UTC m=+120.397991004 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:26.464918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:26.464867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k6554" event={"ID":"611abd7b-ca37-43e2-a480-f0fc4b1aa306","Type":"ContainerStarted","Data":"b27d4df6740cebfd169fc5c555abd9225586edc21a10f7717a5bda3149e9005d"}
Apr 22 17:36:26.485654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:26.485609 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-k6554" podStartSLOduration=1.615935784 podStartE2EDuration="3.485595401s" podCreationTimestamp="2026-04-22 17:36:23 +0000 UTC" firstStartedPulling="2026-04-22 17:36:23.747758981 +0000 UTC m=+117.227015641" lastFinishedPulling="2026-04-22 17:36:25.617418595 +0000 UTC m=+119.096675258" observedRunningTime="2026-04-22 17:36:26.4852554 +0000 UTC m=+119.964512095" watchObservedRunningTime="2026-04-22 17:36:26.485595401 +0000 UTC m=+119.964852082"
Apr 22 17:36:26.936160 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:26.936062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:26.936344 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:26.936263 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret
"cluster-monitoring-operator-tls" not found
Apr 22 17:36:26.936344 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:26.936329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:30.936314161 +0000 UTC m=+124.415570822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:36:27.961553 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:27.961527 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4nmbk_0e0b96d6-6ddc-4b33-8123-2eec86f21a66/dns-node-resolver/0.log"
Apr 22 17:36:28.761204 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:28.761179 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7q8mm_2f76cec7-7e9b-4a76-a5c5-c12f9790bb38/node-ca/0.log"
Apr 22 17:36:29.192130 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.192094 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"]
Apr 22 17:36:29.195289 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.195272 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.197442 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.197412 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:36:29.197565 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.197523 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 17:36:29.198313 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.198297 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 17:36:29.198394 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.198362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9hg7m\""
Apr 22 17:36:29.203135 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.203113 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"]
Apr 22 17:36:29.255168 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.255132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.255353 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.255198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prt88\" (UniqueName:
\"kubernetes.io/projected/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-kube-api-access-prt88\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.355521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.355491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prt88\" (UniqueName: \"kubernetes.io/projected/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-kube-api-access-prt88\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.355723 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.355553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.355723 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:29.355646 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:36:29.355723 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:29.355726 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls podName:11cf0c6e-12cc-4c1a-9e4e-5487377580a7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:29.855688419 +0000 UTC m=+123.334945078 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q5qcm" (UID: "11cf0c6e-12cc-4c1a-9e4e-5487377580a7") : secret "samples-operator-tls" not found
Apr 22 17:36:29.363995 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.363968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prt88\" (UniqueName: \"kubernetes.io/projected/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-kube-api-access-prt88\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.859918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:29.859880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"
Apr 22 17:36:29.860099 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:29.860027 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:36:29.860146 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:29.860113 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls podName:11cf0c6e-12cc-4c1a-9e4e-5487377580a7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:30.860098222 +0000 UTC m=+124.339354882 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q5qcm" (UID: "11cf0c6e-12cc-4c1a-9e4e-5487377580a7") : secret "samples-operator-tls" not found
Apr 22 17:36:30.181911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.181881 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gpsv4"]
Apr 22 17:36:30.184630 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.184615 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.186807 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.186787 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 17:36:30.186910 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.186808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:36:30.187051 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.187035 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 17:36:30.187124 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.187108 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-br8cf\""
Apr 22 17:36:30.187323 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.187307 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 17:36:30.193370 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.193349 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-console-operator/console-operator-9d4b6777b-gpsv4"]
Apr 22 17:36:30.194367 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.194225 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 17:36:30.263036 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.263001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbgf\" (UniqueName: \"kubernetes.io/projected/31b62473-ecf9-4e50-926e-50d8d8ca9231-kube-api-access-9vbgf\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.263228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.263061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-trusted-ca\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.263228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.263158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-config\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.263228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.263220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31b62473-ecf9-4e50-926e-50d8d8ca9231-serving-cert\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") "
pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.364192 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbgf\" (UniqueName: \"kubernetes.io/projected/31b62473-ecf9-4e50-926e-50d8d8ca9231-kube-api-access-9vbgf\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.364362 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-trusted-ca\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.364407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-config\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.364456 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31b62473-ecf9-4e50-926e-50d8d8ca9231-serving-cert\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.364977 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-config\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.365084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.364956 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31b62473-ecf9-4e50-926e-50d8d8ca9231-trusted-ca\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.366681 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.366663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31b62473-ecf9-4e50-926e-50d8d8ca9231-serving-cert\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.372802 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.372774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbgf\" (UniqueName: \"kubernetes.io/projected/31b62473-ecf9-4e50-926e-50d8d8ca9231-kube-api-access-9vbgf\") pod \"console-operator-9d4b6777b-gpsv4\" (UID: \"31b62473-ecf9-4e50-926e-50d8d8ca9231\") " pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:36:30.496410 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.496317 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:36:30.606343 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.606313 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gpsv4"] Apr 22 17:36:30.609489 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:30.609459 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b62473_ecf9_4e50_926e_50d8d8ca9231.slice/crio-2a1a95bd438075082e03a1e3934d0e6776f3ff3cc25592b6b6ef9b0c0a87493c WatchSource:0}: Error finding container 2a1a95bd438075082e03a1e3934d0e6776f3ff3cc25592b6b6ef9b0c0a87493c: Status 404 returned error can't find the container with id 2a1a95bd438075082e03a1e3934d0e6776f3ff3cc25592b6b6ef9b0c0a87493c Apr 22 17:36:30.868253 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.868158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:30.868394 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:30.868304 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:36:30.868394 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:30.868371 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls podName:11cf0c6e-12cc-4c1a-9e4e-5487377580a7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:32.868355526 +0000 UTC m=+126.347612191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q5qcm" (UID: "11cf0c6e-12cc-4c1a-9e4e-5487377580a7") : secret "samples-operator-tls" not found Apr 22 17:36:30.968850 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:30.968808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" Apr 22 17:36:30.969018 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:30.968963 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:36:30.969054 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:30.969038 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:38.96902284 +0000 UTC m=+132.448279500 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:36:31.247408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.247373 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"] Apr 22 17:36:31.250448 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.250424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.253159 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.252876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:36:31.253159 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.253008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:36:31.253159 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.253060 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:36:31.253159 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.253159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rmqdq\"" Apr 22 17:36:31.259624 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.259603 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:36:31.260323 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.260280 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"] Apr 22 17:36:31.371720 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.371909 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.371909 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.371909 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.371909 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371899 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.372126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.372126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.371968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865ls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.372126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.372037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473323 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 
17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.473831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-865ls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.474109 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:31.473948 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:31.474109 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.473956 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.474109 ip-10-0-135-36 
kubenswrapper[2572]: E0422 17:36:31.473969 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d6bdb5fd9-bg7x4: secret "image-registry-tls" not found Apr 22 17:36:31.474109 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:31.474036 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls podName:579a3ebc-9a66-4781-8220-a94c0b7a620a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:31.974017763 +0000 UTC m=+125.453274429 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls") pod "image-registry-d6bdb5fd9-bg7x4" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a") : secret "image-registry-tls" not found Apr 22 17:36:31.474403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.474173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.474726 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.474663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.476508 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.476465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.476672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.476643 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerStarted","Data":"2a1a95bd438075082e03a1e3934d0e6776f3ff3cc25592b6b6ef9b0c0a87493c"} Apr 22 17:36:31.476672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.476667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.482535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.482495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.482686 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.482662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-865ls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.978121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:31.978082 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:31.978300 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:31.978254 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:31.978300 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:31.978274 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d6bdb5fd9-bg7x4: secret "image-registry-tls" not found Apr 22 17:36:31.978391 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:31.978345 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls podName:579a3ebc-9a66-4781-8220-a94c0b7a620a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:32.978321578 +0000 UTC m=+126.457578261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls") pod "image-registry-d6bdb5fd9-bg7x4" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a") : secret "image-registry-tls" not found Apr 22 17:36:32.479725 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.479665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerStarted","Data":"fa67bad82c2bb3381a84a93142ed25065ef103062ddaabf710b311914849fc1a"} Apr 22 17:36:32.480162 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.479828 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:36:32.481492 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.481298 2572 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-gpsv4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" start-of-body= Apr 22 17:36:32.481492 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.481340 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" Apr 22 17:36:32.494128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.494081 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podStartSLOduration=0.727145605 podStartE2EDuration="2.49406729s" podCreationTimestamp="2026-04-22 17:36:30 +0000 UTC" 
firstStartedPulling="2026-04-22 17:36:30.611307836 +0000 UTC m=+124.090564496" lastFinishedPulling="2026-04-22 17:36:32.378229518 +0000 UTC m=+125.857486181" observedRunningTime="2026-04-22 17:36:32.493798269 +0000 UTC m=+125.973054952" watchObservedRunningTime="2026-04-22 17:36:32.49406729 +0000 UTC m=+125.973323972" Apr 22 17:36:32.884533 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.884492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:32.884734 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:32.884625 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:36:32.884734 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:32.884689 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls podName:11cf0c6e-12cc-4c1a-9e4e-5487377580a7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:36.884674768 +0000 UTC m=+130.363931428 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q5qcm" (UID: "11cf0c6e-12cc-4c1a-9e4e-5487377580a7") : secret "samples-operator-tls" not found Apr 22 17:36:32.985026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:32.984990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:32.985190 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:32.985167 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:32.985230 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:32.985193 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d6bdb5fd9-bg7x4: secret "image-registry-tls" not found Apr 22 17:36:32.985301 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:32.985291 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls podName:579a3ebc-9a66-4781-8220-a94c0b7a620a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:34.985273091 +0000 UTC m=+128.464529751 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls") pod "image-registry-d6bdb5fd9-bg7x4" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a") : secret "image-registry-tls" not found Apr 22 17:36:33.482738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:33.482691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/0.log" Apr 22 17:36:33.483075 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:33.482749 2572 generic.go:358] "Generic (PLEG): container finished" podID="31b62473-ecf9-4e50-926e-50d8d8ca9231" containerID="fa67bad82c2bb3381a84a93142ed25065ef103062ddaabf710b311914849fc1a" exitCode=255 Apr 22 17:36:33.483075 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:33.482787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerDied","Data":"fa67bad82c2bb3381a84a93142ed25065ef103062ddaabf710b311914849fc1a"} Apr 22 17:36:33.483075 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:33.483063 2572 scope.go:117] "RemoveContainer" containerID="fa67bad82c2bb3381a84a93142ed25065ef103062ddaabf710b311914849fc1a" Apr 22 17:36:34.486352 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.486326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/1.log" Apr 22 17:36:34.486803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.486659 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/0.log" Apr 22 17:36:34.486803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.486688 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="31b62473-ecf9-4e50-926e-50d8d8ca9231" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa" exitCode=255 Apr 22 17:36:34.486803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.486735 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerDied","Data":"0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa"} Apr 22 17:36:34.486803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.486773 2572 scope.go:117] "RemoveContainer" containerID="fa67bad82c2bb3381a84a93142ed25065ef103062ddaabf710b311914849fc1a" Apr 22 17:36:34.487071 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:34.487050 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa" Apr 22 17:36:34.487275 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:34.487257 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231" Apr 22 17:36:35.002384 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:35.002345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:35.002589 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:35.002505 2572 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:35.002589 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:35.002528 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d6bdb5fd9-bg7x4: secret "image-registry-tls" not found Apr 22 17:36:35.002728 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:35.002602 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls podName:579a3ebc-9a66-4781-8220-a94c0b7a620a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:39.002581874 +0000 UTC m=+132.481838551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls") pod "image-registry-d6bdb5fd9-bg7x4" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a") : secret "image-registry-tls" not found Apr 22 17:36:35.489693 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:35.489664 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/1.log" Apr 22 17:36:35.490171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:35.490071 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa" Apr 22 17:36:35.490251 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:35.490234 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231" Apr 22 17:36:36.816684 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:36.816627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm" Apr 22 17:36:36.817088 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:36.816792 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:36:36.817088 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:36.816857 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs podName:34c7625b-b71f-4d8d-a883-c465098dbba7 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:38.816840423 +0000 UTC m=+252.296097084 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs") pod "network-metrics-daemon-djttm" (UID: "34c7625b-b71f-4d8d-a883-c465098dbba7") : secret "metrics-daemon-secret" not found Apr 22 17:36:36.917208 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:36.917175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:36.917357 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:36.917312 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:36:36.917394 ip-10-0-135-36 kubenswrapper[2572]: 
E0422 17:36:36.917374 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls podName:11cf0c6e-12cc-4c1a-9e4e-5487377580a7 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:44.91735907 +0000 UTC m=+138.396615731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q5qcm" (UID: "11cf0c6e-12cc-4c1a-9e4e-5487377580a7") : secret "samples-operator-tls" not found Apr 22 17:36:37.206030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.205987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf"] Apr 22 17:36:37.210078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.210056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" Apr 22 17:36:37.212728 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.212711 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-888mj\"" Apr 22 17:36:37.219384 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.219363 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf"] Apr 22 17:36:37.321027 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.320971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rk2\" (UniqueName: \"kubernetes.io/projected/f3b152e4-0668-4dd6-9f03-c910a1ce0561-kube-api-access-78rk2\") pod \"network-check-source-8894fc9bd-pwhxf\" (UID: \"f3b152e4-0668-4dd6-9f03-c910a1ce0561\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" Apr 22 
17:36:37.422257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.422206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78rk2\" (UniqueName: \"kubernetes.io/projected/f3b152e4-0668-4dd6-9f03-c910a1ce0561-kube-api-access-78rk2\") pod \"network-check-source-8894fc9bd-pwhxf\" (UID: \"f3b152e4-0668-4dd6-9f03-c910a1ce0561\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" Apr 22 17:36:37.430822 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.430789 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rk2\" (UniqueName: \"kubernetes.io/projected/f3b152e4-0668-4dd6-9f03-c910a1ce0561-kube-api-access-78rk2\") pod \"network-check-source-8894fc9bd-pwhxf\" (UID: \"f3b152e4-0668-4dd6-9f03-c910a1ce0561\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" Apr 22 17:36:37.519212 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.519121 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" Apr 22 17:36:37.629233 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:37.629202 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf"] Apr 22 17:36:37.632255 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:37.632221 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b152e4_0668_4dd6_9f03_c910a1ce0561.slice/crio-7d064900823c285869578d56cc88b1849801a510da86f6451d024178387043a0 WatchSource:0}: Error finding container 7d064900823c285869578d56cc88b1849801a510da86f6451d024178387043a0: Status 404 returned error can't find the container with id 7d064900823c285869578d56cc88b1849801a510da86f6451d024178387043a0 Apr 22 17:36:38.497967 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:38.497926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" event={"ID":"f3b152e4-0668-4dd6-9f03-c910a1ce0561","Type":"ContainerStarted","Data":"2da95b6f8f3456bf706264de5c147cd6013b1c75b299c2886c387ece01463395"} Apr 22 17:36:38.497967 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:38.497968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" event={"ID":"f3b152e4-0668-4dd6-9f03-c910a1ce0561","Type":"ContainerStarted","Data":"7d064900823c285869578d56cc88b1849801a510da86f6451d024178387043a0"} Apr 22 17:36:38.513782 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:38.513736 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwhxf" podStartSLOduration=1.513719503 podStartE2EDuration="1.513719503s" podCreationTimestamp="2026-04-22 17:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:36:38.512913763 +0000 UTC m=+131.992170445" watchObservedRunningTime="2026-04-22 17:36:38.513719503 +0000 UTC m=+131.992976184" Apr 22 17:36:39.036113 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.036078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:39.036293 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.036154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" Apr 22 17:36:39.036293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.036221 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:39.036293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.036245 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d6bdb5fd9-bg7x4: secret "image-registry-tls" not found Apr 22 17:36:39.036293 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.036246 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:36:39.036420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.036301 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls podName:579a3ebc-9a66-4781-8220-a94c0b7a620a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:47.036283142 +0000 UTC m=+140.515539806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls") pod "image-registry-d6bdb5fd9-bg7x4" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a") : secret "image-registry-tls" not found Apr 22 17:36:39.036420 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.036316 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls podName:7676326c-368f-406c-a545-55cebded1f2b nodeName:}" failed. No retries permitted until 2026-04-22 17:36:55.036309693 +0000 UTC m=+148.515566354 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-knbfq" (UID: "7676326c-368f-406c-a545-55cebded1f2b") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:36:39.037765 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.037740 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x"] Apr 22 17:36:39.041654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.041555 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" Apr 22 17:36:39.044174 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.044152 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:36:39.044879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.044859 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:36:39.044879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.044872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-c8htn\"" Apr 22 17:36:39.050267 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.050245 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x"] Apr 22 17:36:39.136935 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.136899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptvl\" (UniqueName: \"kubernetes.io/projected/306bad3b-30a1-40bf-9575-15a852186090-kube-api-access-zptvl\") pod \"migrator-74bb7799d9-r4g8x\" (UID: \"306bad3b-30a1-40bf-9575-15a852186090\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" Apr 22 17:36:39.237314 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.237279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zptvl\" (UniqueName: \"kubernetes.io/projected/306bad3b-30a1-40bf-9575-15a852186090-kube-api-access-zptvl\") pod \"migrator-74bb7799d9-r4g8x\" (UID: \"306bad3b-30a1-40bf-9575-15a852186090\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" Apr 22 17:36:39.245197 ip-10-0-135-36 kubenswrapper[2572]: 
I0422 17:36:39.245170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptvl\" (UniqueName: \"kubernetes.io/projected/306bad3b-30a1-40bf-9575-15a852186090-kube-api-access-zptvl\") pod \"migrator-74bb7799d9-r4g8x\" (UID: \"306bad3b-30a1-40bf-9575-15a852186090\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" Apr 22 17:36:39.350665 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.350581 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" Apr 22 17:36:39.465253 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.465194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x"] Apr 22 17:36:39.467567 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:39.467539 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod306bad3b_30a1_40bf_9575_15a852186090.slice/crio-5ddecf85c2165c5c999ff01d79156bc22a6dbd3b1cebd2b6f53dabdd1d33dedc WatchSource:0}: Error finding container 5ddecf85c2165c5c999ff01d79156bc22a6dbd3b1cebd2b6f53dabdd1d33dedc: Status 404 returned error can't find the container with id 5ddecf85c2165c5c999ff01d79156bc22a6dbd3b1cebd2b6f53dabdd1d33dedc Apr 22 17:36:39.501023 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.500993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" event={"ID":"306bad3b-30a1-40bf-9575-15a852186090","Type":"ContainerStarted","Data":"5ddecf85c2165c5c999ff01d79156bc22a6dbd3b1cebd2b6f53dabdd1d33dedc"} Apr 22 17:36:39.624616 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.624543 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xq75n"] Apr 22 17:36:39.628755 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:36:39.628734 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.630993 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.630976 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:36:39.631439 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.631420 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-c22fx\"" Apr 22 17:36:39.631536 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.631520 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:36:39.638604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.638584 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xq75n"] Apr 22 17:36:39.740416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.740383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.740588 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.740430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5dab7bfc-8f04-4d6d-816b-38e69e040297-data-volume\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.740588 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.740448 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp56t\" (UniqueName: \"kubernetes.io/projected/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-api-access-zp56t\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.740588 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.740527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.740687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.740616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5dab7bfc-8f04-4d6d-816b-38e69e040297-crio-socket\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5dab7bfc-8f04-4d6d-816b-38e69e040297-crio-socket\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5dab7bfc-8f04-4d6d-816b-38e69e040297-data-volume\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp56t\" (UniqueName: \"kubernetes.io/projected/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-api-access-zp56t\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5dab7bfc-8f04-4d6d-816b-38e69e040297-crio-socket\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.841892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.841860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.842141 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.841952 2572 secret.go:189] Couldn't get secret 
openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:39.842141 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:39.842013 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls podName:5dab7bfc-8f04-4d6d-816b-38e69e040297 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:40.341996578 +0000 UTC m=+133.821253252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xq75n" (UID: "5dab7bfc-8f04-4d6d-816b-38e69e040297") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:39.842222 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.842200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5dab7bfc-8f04-4d6d-816b-38e69e040297-data-volume\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.842842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.842826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:39.849876 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:39.849859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp56t\" (UniqueName: \"kubernetes.io/projected/5dab7bfc-8f04-4d6d-816b-38e69e040297-kube-api-access-zp56t\") pod \"insights-runtime-extractor-xq75n\" (UID: 
\"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:40.346823 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:40.346771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:40.347004 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:40.346927 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:40.347004 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:40.346997 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls podName:5dab7bfc-8f04-4d6d-816b-38e69e040297 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:41.346978614 +0000 UTC m=+134.826235274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xq75n" (UID: "5dab7bfc-8f04-4d6d-816b-38e69e040297") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:40.496958 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:40.496921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:36:40.497350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:40.497329 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa" Apr 22 17:36:40.497527 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:40.497508 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231" Apr 22 17:36:41.355676 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:41.355634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:41.356199 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:41.355852 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:41.356199 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:41.355937 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls podName:5dab7bfc-8f04-4d6d-816b-38e69e040297 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:43.355915144 +0000 UTC m=+136.835171804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xq75n" (UID: "5dab7bfc-8f04-4d6d-816b-38e69e040297") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:41.507298 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:41.507262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" event={"ID":"306bad3b-30a1-40bf-9575-15a852186090","Type":"ContainerStarted","Data":"6830c363a2745f49548cf8c174f563a59f0541aed1648130d4ae498abe8efc34"} Apr 22 17:36:41.507298 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:41.507299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" event={"ID":"306bad3b-30a1-40bf-9575-15a852186090","Type":"ContainerStarted","Data":"90c7e69e8f63cacbf3aee003f531ec04cf4af76f61a2b47a3897cb21d49ebde0"} Apr 22 17:36:41.528530 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:41.528484 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-r4g8x" podStartSLOduration=1.2653819419999999 podStartE2EDuration="2.528471351s" podCreationTimestamp="2026-04-22 17:36:39 +0000 UTC" firstStartedPulling="2026-04-22 17:36:39.469463722 +0000 UTC m=+132.948720382" lastFinishedPulling="2026-04-22 17:36:40.732553128 +0000 UTC m=+134.211809791" observedRunningTime="2026-04-22 17:36:41.527447001 +0000 UTC m=+135.006703684" watchObservedRunningTime="2026-04-22 
17:36:41.528471351 +0000 UTC m=+135.007728033" Apr 22 17:36:42.480189 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:42.480149 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:36:42.480571 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:42.480531 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa" Apr 22 17:36:42.480739 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:42.480680 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231" Apr 22 17:36:43.372672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:43.372633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n" Apr 22 17:36:43.372896 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:43.372821 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:43.372968 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:43.372903 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls podName:5dab7bfc-8f04-4d6d-816b-38e69e040297 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:36:47.372881745 +0000 UTC m=+140.852138405 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xq75n" (UID: "5dab7bfc-8f04-4d6d-816b-38e69e040297") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:44.986230 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:44.986190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:44.988526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:44.988504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11cf0c6e-12cc-4c1a-9e4e-5487377580a7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q5qcm\" (UID: \"11cf0c6e-12cc-4c1a-9e4e-5487377580a7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:45.104876 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:45.104847 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" Apr 22 17:36:45.223358 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:45.223324 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm"] Apr 22 17:36:45.519734 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:45.519628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" event={"ID":"11cf0c6e-12cc-4c1a-9e4e-5487377580a7","Type":"ContainerStarted","Data":"e016af18b2d586670a2695f22dd391a4a96df7a0681e60a0af6af1d0c9cbfec4"} Apr 22 17:36:47.104478 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.104451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:47.106607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.106590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"image-registry-d6bdb5fd9-bg7x4\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") " pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" Apr 22 17:36:47.161354 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.161322 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"
Apr 22 17:36:47.292060 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.292025 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"]
Apr 22 17:36:47.295613 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:47.295588 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod579a3ebc_9a66_4781_8220_a94c0b7a620a.slice/crio-2eba8b9e24a161a17f2e4c79780d6e7ec309b515042bd1400862323175bf55e9 WatchSource:0}: Error finding container 2eba8b9e24a161a17f2e4c79780d6e7ec309b515042bd1400862323175bf55e9: Status 404 returned error can't find the container with id 2eba8b9e24a161a17f2e4c79780d6e7ec309b515042bd1400862323175bf55e9
Apr 22 17:36:47.407086 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.407050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n"
Apr 22 17:36:47.409315 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.409288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5dab7bfc-8f04-4d6d-816b-38e69e040297-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xq75n\" (UID: \"5dab7bfc-8f04-4d6d-816b-38e69e040297\") " pod="openshift-insights/insights-runtime-extractor-xq75n"
Apr 22 17:36:47.437731 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.437684 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xq75n"
Apr 22 17:36:47.526093 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.526052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" event={"ID":"11cf0c6e-12cc-4c1a-9e4e-5487377580a7","Type":"ContainerStarted","Data":"d432b0c98506cbec581b2673642c57b896a2cc7b042990db01da32d9ada9ceba"}
Apr 22 17:36:47.526283 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.526101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" event={"ID":"11cf0c6e-12cc-4c1a-9e4e-5487377580a7","Type":"ContainerStarted","Data":"9bd8228eae9e515e98a8e6e8e532d8edc583d4ca6a614222c8dd2ea8a7e17d45"}
Apr 22 17:36:47.528763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.528732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" event={"ID":"579a3ebc-9a66-4781-8220-a94c0b7a620a","Type":"ContainerStarted","Data":"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"}
Apr 22 17:36:47.528922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.528904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" event={"ID":"579a3ebc-9a66-4781-8220-a94c0b7a620a","Type":"ContainerStarted","Data":"2eba8b9e24a161a17f2e4c79780d6e7ec309b515042bd1400862323175bf55e9"}
Apr 22 17:36:47.529498 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.529476 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"
Apr 22 17:36:47.542577 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.542529 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q5qcm" podStartSLOduration=16.727511076 podStartE2EDuration="18.542513074s" podCreationTimestamp="2026-04-22 17:36:29 +0000 UTC" firstStartedPulling="2026-04-22 17:36:45.269119278 +0000 UTC m=+138.748375938" lastFinishedPulling="2026-04-22 17:36:47.08412126 +0000 UTC m=+140.563377936" observedRunningTime="2026-04-22 17:36:47.542283069 +0000 UTC m=+141.021539776" watchObservedRunningTime="2026-04-22 17:36:47.542513074 +0000 UTC m=+141.021769756"
Apr 22 17:36:47.557851 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.557818 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xq75n"]
Apr 22 17:36:47.562069 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:47.561741 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" podStartSLOduration=16.56172659 podStartE2EDuration="16.56172659s" podCreationTimestamp="2026-04-22 17:36:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:36:47.561368788 +0000 UTC m=+141.040625472" watchObservedRunningTime="2026-04-22 17:36:47.56172659 +0000 UTC m=+141.040983275"
Apr 22 17:36:47.562069 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:47.561971 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dab7bfc_8f04_4d6d_816b_38e69e040297.slice/crio-ea2709d06a616804b2f853be8f16dfd092697f1c152cde7f5a7bf5952345a013 WatchSource:0}: Error finding container ea2709d06a616804b2f853be8f16dfd092697f1c152cde7f5a7bf5952345a013: Status 404 returned error can't find the container with id ea2709d06a616804b2f853be8f16dfd092697f1c152cde7f5a7bf5952345a013
Apr 22 17:36:48.533645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:48.533609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xq75n" event={"ID":"5dab7bfc-8f04-4d6d-816b-38e69e040297","Type":"ContainerStarted","Data":"ff258695e64c7d9e479cd601161e1907fdeacdaac33b4ca99ed55dad6588f8ae"}
Apr 22 17:36:48.533645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:48.533650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xq75n" event={"ID":"5dab7bfc-8f04-4d6d-816b-38e69e040297","Type":"ContainerStarted","Data":"726879e02033a844f63a89b024e388c4942455c99b865eaae10fc1aeb6c5c0b0"}
Apr 22 17:36:48.534063 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:48.533663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xq75n" event={"ID":"5dab7bfc-8f04-4d6d-816b-38e69e040297","Type":"ContainerStarted","Data":"ea2709d06a616804b2f853be8f16dfd092697f1c152cde7f5a7bf5952345a013"}
Apr 22 17:36:50.540813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:50.540772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xq75n" event={"ID":"5dab7bfc-8f04-4d6d-816b-38e69e040297","Type":"ContainerStarted","Data":"4a24fc3f0d86637573c2d8289b7c5b047bff359ce2b8764a16f5ee618576c27e"}
Apr 22 17:36:50.558146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:50.558104 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xq75n" podStartSLOduration=9.34342728 podStartE2EDuration="11.558090648s" podCreationTimestamp="2026-04-22 17:36:39 +0000 UTC" firstStartedPulling="2026-04-22 17:36:47.621801322 +0000 UTC m=+141.101057998" lastFinishedPulling="2026-04-22 17:36:49.836464692 +0000 UTC m=+143.315721366" observedRunningTime="2026-04-22 17:36:50.556918119 +0000 UTC m=+144.036174825" watchObservedRunningTime="2026-04-22 17:36:50.558090648 +0000 UTC m=+144.037347330"
Apr 22 17:36:55.067386 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.067343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:55.069862 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.069827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7676326c-368f-406c-a545-55cebded1f2b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-knbfq\" (UID: \"7676326c-368f-406c-a545-55cebded1f2b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:55.094639 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.094614 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa"
Apr 22 17:36:55.308269 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.308238 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-k5749\""
Apr 22 17:36:55.316930 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.316890 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"
Apr 22 17:36:55.436111 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.436080 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq"]
Apr 22 17:36:55.439064 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:36:55.439031 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7676326c_368f_406c_a545_55cebded1f2b.slice/crio-4b7e3d98150fabd7e55566ea3eb69901361ac599848519e527d39c1b81c9fc6f WatchSource:0}: Error finding container 4b7e3d98150fabd7e55566ea3eb69901361ac599848519e527d39c1b81c9fc6f: Status 404 returned error can't find the container with id 4b7e3d98150fabd7e55566ea3eb69901361ac599848519e527d39c1b81c9fc6f
Apr 22 17:36:55.553602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.553565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" event={"ID":"7676326c-368f-406c-a545-55cebded1f2b","Type":"ContainerStarted","Data":"4b7e3d98150fabd7e55566ea3eb69901361ac599848519e527d39c1b81c9fc6f"}
Apr 22 17:36:55.554842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.554822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:36:55.555196 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.555183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/1.log"
Apr 22 17:36:55.555253 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.555215 2572 generic.go:358] "Generic (PLEG): container finished" podID="31b62473-ecf9-4e50-926e-50d8d8ca9231" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de" exitCode=255
Apr 22 17:36:55.555289 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.555250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerDied","Data":"5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de"}
Apr 22 17:36:55.555289 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.555277 2572 scope.go:117] "RemoveContainer" containerID="0091f1b6a142a163b337ed6a9013574222552782495c60ad632e622b3f4c3afa"
Apr 22 17:36:55.555623 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:55.555610 2572 scope.go:117] "RemoveContainer" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de"
Apr 22 17:36:55.555840 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:36:55.555813 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231"
Apr 22 17:36:56.559256 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:56.559229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:36:57.563248 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:57.563163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" event={"ID":"7676326c-368f-406c-a545-55cebded1f2b","Type":"ContainerStarted","Data":"c2d45d819d2d8d05bd3c468194ca3985f50fc743113f4cc459e113c9cf67b867"}
Apr 22 17:36:57.580513 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:57.580451 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-knbfq" podStartSLOduration=32.87483101 podStartE2EDuration="34.580435168s" podCreationTimestamp="2026-04-22 17:36:23 +0000 UTC" firstStartedPulling="2026-04-22 17:36:55.440865519 +0000 UTC m=+148.920122180" lastFinishedPulling="2026-04-22 17:36:57.146469664 +0000 UTC m=+150.625726338" observedRunningTime="2026-04-22 17:36:57.579530375 +0000 UTC m=+151.058787060" watchObservedRunningTime="2026-04-22 17:36:57.580435168 +0000 UTC m=+151.059691854"
Apr 22 17:36:59.730502 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:36:59.730469 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"]
Apr 22 17:37:00.496820 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:00.496778 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:37:00.497216 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:00.497200 2572 scope.go:117] "RemoveContainer" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de"
Apr 22 17:37:00.497431 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:00.497411 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231"
Apr 22 17:37:02.479987 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:02.479945 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4"
Apr 22 17:37:02.480381 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:02.480318 2572 scope.go:117] "RemoveContainer" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de"
Apr 22 17:37:02.480483 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:02.480466 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231"
Apr 22 17:37:03.927616 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:03.927559 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m45f5" podUID="8da954ec-226e-4e6e-a43d-0ef4bc182e6c"
Apr 22 17:37:03.931675 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:03.931655 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-66csc" podUID="7c3b2116-a369-4692-8afd-099a5a5a39cd"
Apr 22 17:37:04.102603 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:04.102564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-djttm" podUID="34c7625b-b71f-4d8d-a883-c465098dbba7"
Apr 22 17:37:04.580033 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:04.579998 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66csc"
Apr 22 17:37:04.580229 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:04.579998 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:05.973183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.973149 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-r7j8x"]
Apr 22 17:37:05.979294 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.979264 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:05.981530 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.981503 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:37:05.981785 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.981763 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:37:05.981885 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.981786 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:37:05.982890 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.982863 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mbzq\""
Apr 22 17:37:05.983155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.983136 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:37:05.984259 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.984239 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"]
Apr 22 17:37:05.988653 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.988636 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:05.990852 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.990838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 17:37:05.990946 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.990840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8gm9v\""
Apr 22 17:37:05.990946 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.990883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 17:37:05.996454 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:05.996435 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"]
Apr 22 17:37:06.050231 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/843754df-5fe6-43b3-9b50-0b7e518d1c36-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.050231 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhfd\" (UniqueName: \"kubernetes.io/projected/843754df-5fe6-43b3-9b50-0b7e518d1c36-kube-api-access-jmhfd\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-tls\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-metrics-client-ca\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-root\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-textfile\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-accelerators-collector-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.050471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-sys\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-wtmp\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7p7\" (UniqueName: \"kubernetes.io/projected/b8d65af8-cf50-4937-a8f0-ee24c1820476-kube-api-access-ls7p7\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.050897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.050568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.151125 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.151125 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-sys\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-wtmp\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7p7\" (UniqueName: \"kubernetes.io/projected/b8d65af8-cf50-4937-a8f0-ee24c1820476-kube-api-access-ls7p7\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-sys\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/843754df-5fe6-43b3-9b50-0b7e518d1c36-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.151326 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhfd\" (UniqueName: \"kubernetes.io/projected/843754df-5fe6-43b3-9b50-0b7e518d1c36-kube-api-access-jmhfd\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-tls\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-wtmp\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-metrics-client-ca\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-root\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-textfile\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151921 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-accelerators-collector-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151921 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b8d65af8-cf50-4937-a8f0-ee24c1820476-root\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.151921 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:06.151847 2572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 17:37:06.151921 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:06.151913 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls podName:843754df-5fe6-43b3-9b50-0b7e518d1c36 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:06.651892462 +0000 UTC m=+160.131149142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vgpzl" (UID: "843754df-5fe6-43b3-9b50-0b7e518d1c36") : secret "openshift-state-metrics-tls" not found
Apr 22 17:37:06.152114 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.151987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-textfile\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.152175 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.152120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/843754df-5fe6-43b3-9b50-0b7e518d1c36-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.152228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.152178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-metrics-client-ca\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.152277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.152261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-accelerators-collector-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.154121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.154089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.154275 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.154258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b8d65af8-cf50-4937-a8f0-ee24c1820476-node-exporter-tls\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.154587 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.154566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.165502 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.165470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7p7\" (UniqueName: \"kubernetes.io/projected/b8d65af8-cf50-4937-a8f0-ee24c1820476-kube-api-access-ls7p7\") pod \"node-exporter-r7j8x\" (UID: \"b8d65af8-cf50-4937-a8f0-ee24c1820476\") " pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.165609 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.165520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhfd\" (UniqueName: \"kubernetes.io/projected/843754df-5fe6-43b3-9b50-0b7e518d1c36-kube-api-access-jmhfd\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.289209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.289123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r7j8x"
Apr 22 17:37:06.297837 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:06.297811 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d65af8_cf50_4937_a8f0_ee24c1820476.slice/crio-6593ad481f4e7db8aef92bcb547966c5ed7e8d6d9f0d44423d8fe921e93b498a WatchSource:0}: Error finding container 6593ad481f4e7db8aef92bcb547966c5ed7e8d6d9f0d44423d8fe921e93b498a: Status 404 returned error can't find the container with id 6593ad481f4e7db8aef92bcb547966c5ed7e8d6d9f0d44423d8fe921e93b498a
Apr 22 17:37:06.585780 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.585679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r7j8x" event={"ID":"b8d65af8-cf50-4937-a8f0-ee24c1820476","Type":"ContainerStarted","Data":"6593ad481f4e7db8aef92bcb547966c5ed7e8d6d9f0d44423d8fe921e93b498a"}
Apr 22 17:37:06.656423 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.656378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.658817 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.658786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/843754df-5fe6-43b3-9b50-0b7e518d1c36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vgpzl\" (UID: \"843754df-5fe6-43b3-9b50-0b7e518d1c36\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:06.897169 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:06.897133 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"
Apr 22 17:37:07.110074 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.107459 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:37:07.112276 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.112250 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.113505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.113073 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:37:07.116025 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.115514 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:37:07.116025 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.115833 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:37:07.116205 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116053 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:37:07.116258 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116235 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:37:07.116875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:37:07.116875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116415 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:37:07.116875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:37:07.116875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:37:07.116875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:37:07.117200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.116910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s8drv\"" Apr 22 17:37:07.161031 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.160673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161031 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.160768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161031 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.160799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161031 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.160877 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161353 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161409 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161462 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161514 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161487 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161566 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161620 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161676 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.161755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.161718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89v28\" 
(UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.193359 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.193320 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl"] Apr 22 17:37:07.196303 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:07.196275 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843754df_5fe6_43b3_9b50_0b7e518d1c36.slice/crio-aef931c39c2d6af988320655ae7a36c6107ed404ae026a299280c82f5ff2e3fc WatchSource:0}: Error finding container aef931c39c2d6af988320655ae7a36c6107ed404ae026a299280c82f5ff2e3fc: Status 404 returned error can't find the container with id aef931c39c2d6af988320655ae7a36c6107ed404ae026a299280c82f5ff2e3fc Apr 22 17:37:07.262318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262397 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89v28\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262397 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262506 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262506 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262506 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262654 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262915 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.262915 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
22 17:37:07.262915 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.262756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.264048 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.263145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.264156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.264058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.264866 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.264518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.267756 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.267120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.267756 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.267301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.268146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.268096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.269033 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.269011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.270605 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.270575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89v28\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.270888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.270842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config\") pod \"alertmanager-main-0\" (UID: 
\"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.272014 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.271987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.275076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.275051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.275837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.275575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.277126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.277102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.424616 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.424520 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:37:07.553833 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.553795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:37:07.557774 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:07.557745 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c72a2a_ec47_49ee_9aa8_3deafd8ce2f8.slice/crio-eb9ed3546939a7ca3e35f470afc0f59c1fbf5173a950b8d7388874a1bece528a WatchSource:0}: Error finding container eb9ed3546939a7ca3e35f470afc0f59c1fbf5173a950b8d7388874a1bece528a: Status 404 returned error can't find the container with id eb9ed3546939a7ca3e35f470afc0f59c1fbf5173a950b8d7388874a1bece528a Apr 22 17:37:07.589228 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.589190 2572 generic.go:358] "Generic (PLEG): container finished" podID="b8d65af8-cf50-4937-a8f0-ee24c1820476" containerID="a62e18753e22e8fe4cddd2eb55799fd5b89426556bfaf8abf08b10c7ea34b2e9" exitCode=0 Apr 22 17:37:07.589408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.589278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r7j8x" event={"ID":"b8d65af8-cf50-4937-a8f0-ee24c1820476","Type":"ContainerDied","Data":"a62e18753e22e8fe4cddd2eb55799fd5b89426556bfaf8abf08b10c7ea34b2e9"} Apr 22 17:37:07.591021 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.590997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl" event={"ID":"843754df-5fe6-43b3-9b50-0b7e518d1c36","Type":"ContainerStarted","Data":"083eaee5dceabaa70d67e1aba325d2a1e4e408d8e2193d56b9405f490c2830c4"} Apr 22 17:37:07.591145 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.591025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl" 
event={"ID":"843754df-5fe6-43b3-9b50-0b7e518d1c36","Type":"ContainerStarted","Data":"d60e7eb21eed6966fe6f9873e4c42acfaf90daf270792052edf88899c1ede06f"} Apr 22 17:37:07.591145 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.591040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl" event={"ID":"843754df-5fe6-43b3-9b50-0b7e518d1c36","Type":"ContainerStarted","Data":"aef931c39c2d6af988320655ae7a36c6107ed404ae026a299280c82f5ff2e3fc"} Apr 22 17:37:07.592098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:07.592077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"eb9ed3546939a7ca3e35f470afc0f59c1fbf5173a950b8d7388874a1bece528a"} Apr 22 17:37:08.095767 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.095665 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"] Apr 22 17:37:08.099724 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.099685 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" Apr 22 17:37:08.102049 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102021 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 17:37:08.102170 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 17:37:08.102170 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102155 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 17:37:08.102310 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102211 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jrn42\"" Apr 22 17:37:08.102415 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 17:37:08.102490 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102461 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-e4kmhp9r18cmm\"" Apr 22 17:37:08.102490 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.102478 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 17:37:08.111449 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.111420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"] Apr 22 17:37:08.169301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kvqmb\" (UniqueName: \"kubernetes.io/projected/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-kube-api-access-kvqmb\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" Apr 22 17:37:08.169450 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" Apr 22 17:37:08.169450 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" Apr 22 17:37:08.169566 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" Apr 22 17:37:08.169566 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-metrics-client-ca\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.169566 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.169750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.169833 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.169780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-grpc-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270531 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270730 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-grpc-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270730 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvqmb\" (UniqueName: \"kubernetes.io/projected/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-kube-api-access-kvqmb\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270730 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270730 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270935 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270935 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-metrics-client-ca\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.270935 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.270809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.271809 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.271752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-metrics-client-ca\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.273522 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.273471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.274165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.274040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.274311 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.274287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.274464 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.274441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.274759 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.274728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.275305 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.275284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-secret-grpc-tls\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.278351 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.278330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvqmb\" (UniqueName: \"kubernetes.io/projected/d32d0e48-df72-4243-81b9-ba8cf37c6bb6-kube-api-access-kvqmb\") pod \"thanos-querier-7969c5c58b-f8f2b\" (UID: \"d32d0e48-df72-4243-81b9-ba8cf37c6bb6\") " pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.410753 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.410692 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:08.545014 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.544978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"]
Apr 22 17:37:08.602663 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.602621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r7j8x" event={"ID":"b8d65af8-cf50-4937-a8f0-ee24c1820476","Type":"ContainerStarted","Data":"71799c12091a20efe5cad34cee405faa161abeac6958b83a21f28405258470df"}
Apr 22 17:37:08.602663 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.602668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r7j8x" event={"ID":"b8d65af8-cf50-4937-a8f0-ee24c1820476","Type":"ContainerStarted","Data":"5c375cde5b271b9a9ed61b2260799b0e72b343da91100dadb06d66a55072e853"}
Apr 22 17:37:08.604974 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.604943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl" event={"ID":"843754df-5fe6-43b3-9b50-0b7e518d1c36","Type":"ContainerStarted","Data":"bf30ce0cfe7e599f3fee15e1e0dcc1e3d2947c98fb98ee86f1303dcafdc44476"}
Apr 22 17:37:08.620436 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.620375 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-r7j8x" podStartSLOduration=2.832440493 podStartE2EDuration="3.620357454s" podCreationTimestamp="2026-04-22 17:37:05 +0000 UTC" firstStartedPulling="2026-04-22 17:37:06.299932777 +0000 UTC m=+159.779189454" lastFinishedPulling="2026-04-22 17:37:07.087849752 +0000 UTC m=+160.567106415" observedRunningTime="2026-04-22 17:37:08.619769409 +0000 UTC m=+162.099026092" watchObservedRunningTime="2026-04-22 17:37:08.620357454 +0000 UTC m=+162.099614135"
Apr 22 17:37:08.637889 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.637837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vgpzl" podStartSLOduration=2.683284606 podStartE2EDuration="3.637822333s" podCreationTimestamp="2026-04-22 17:37:05 +0000 UTC" firstStartedPulling="2026-04-22 17:37:07.332420776 +0000 UTC m=+160.811677436" lastFinishedPulling="2026-04-22 17:37:08.286958495 +0000 UTC m=+161.766215163" observedRunningTime="2026-04-22 17:37:08.637054505 +0000 UTC m=+162.116311189" watchObservedRunningTime="2026-04-22 17:37:08.637822333 +0000 UTC m=+162.117079017"
Apr 22 17:37:08.745733 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:08.745645 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd32d0e48_df72_4243_81b9_ba8cf37c6bb6.slice/crio-0e22768419f3625b93adce3b61b14de75d639be3f52275b60dfae229aee2cdc6 WatchSource:0}: Error finding container 0e22768419f3625b93adce3b61b14de75d639be3f52275b60dfae229aee2cdc6: Status 404 returned error can't find the container with id 0e22768419f3625b93adce3b61b14de75d639be3f52275b60dfae229aee2cdc6
Apr 22 17:37:08.775262 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.775229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:08.775426 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.775307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc"
Apr 22 17:37:08.777532 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.777505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8da954ec-226e-4e6e-a43d-0ef4bc182e6c-metrics-tls\") pod \"dns-default-m45f5\" (UID: \"8da954ec-226e-4e6e-a43d-0ef4bc182e6c\") " pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:08.777670 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.777652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c3b2116-a369-4692-8afd-099a5a5a39cd-cert\") pod \"ingress-canary-66csc\" (UID: \"7c3b2116-a369-4692-8afd-099a5a5a39cd\") " pod="openshift-ingress-canary/ingress-canary-66csc"
Apr 22 17:37:08.783570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.783550 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\""
Apr 22 17:37:08.783635 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.783607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\""
Apr 22 17:37:08.790903 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.790879 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66csc"
Apr 22 17:37:08.791015 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.790976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:08.925885 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.925855 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66csc"]
Apr 22 17:37:08.942519 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:08.942488 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m45f5"]
Apr 22 17:37:08.948740 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:08.948709 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3b2116_a369_4692_8afd_099a5a5a39cd.slice/crio-488006a79a7310be0b51a3fc18b0a7c107662f3b6f8c71044fea2e48ef26b280 WatchSource:0}: Error finding container 488006a79a7310be0b51a3fc18b0a7c107662f3b6f8c71044fea2e48ef26b280: Status 404 returned error can't find the container with id 488006a79a7310be0b51a3fc18b0a7c107662f3b6f8c71044fea2e48ef26b280
Apr 22 17:37:08.949026 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:08.948981 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8da954ec_226e_4e6e_a43d_0ef4bc182e6c.slice/crio-4917d5eafb64d9de138755c7533920a2a0f0162c147e175a3f34dea701e59d18 WatchSource:0}: Error finding container 4917d5eafb64d9de138755c7533920a2a0f0162c147e175a3f34dea701e59d18: Status 404 returned error can't find the container with id 4917d5eafb64d9de138755c7533920a2a0f0162c147e175a3f34dea701e59d18
Apr 22 17:37:09.611663 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.610599 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="1c83890e6fa1ff4785d031446cfbd4e62084304f3e5f41e4df55eefd6585b4f6" exitCode=0
Apr 22 17:37:09.611663 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.610676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"1c83890e6fa1ff4785d031446cfbd4e62084304f3e5f41e4df55eefd6585b4f6"}
Apr 22 17:37:09.616654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.616454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66csc" event={"ID":"7c3b2116-a369-4692-8afd-099a5a5a39cd","Type":"ContainerStarted","Data":"488006a79a7310be0b51a3fc18b0a7c107662f3b6f8c71044fea2e48ef26b280"}
Apr 22 17:37:09.618855 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.618823 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m45f5" event={"ID":"8da954ec-226e-4e6e-a43d-0ef4bc182e6c","Type":"ContainerStarted","Data":"4917d5eafb64d9de138755c7533920a2a0f0162c147e175a3f34dea701e59d18"}
Apr 22 17:37:09.621751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.620992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"0e22768419f3625b93adce3b61b14de75d639be3f52275b60dfae229aee2cdc6"}
Apr 22 17:37:09.740926 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.740881 2572 patch_prober.go:28] interesting pod/image-registry-d6bdb5fd9-bg7x4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 17:37:09.741125 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:09.740947 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:37:12.630043 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.629950 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"9d5a16e57fcad9a4f6831df5088f18809c23e233aaf08209993134f37567101c"}
Apr 22 17:37:12.630043 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.629992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"fd8d4f55f72c3f78c2e1617203c2a5684f9304ee6b4bc54a805ffe72880c4c49"}
Apr 22 17:37:12.630043 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.630007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"07d8ab06f5c8204aded469d0c82a971bd2b45d90e078378a4acd695a7b7e668b"}
Apr 22 17:37:12.632540 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.632516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"a02ed83049c2bea8be6e5c21dda92b70a9305beb72416c87a005a4040af12fbb"}
Apr 22 17:37:12.632540 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.632542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"4ef67ebd68307f7af374760666ea396969d1823f08a56cacdee89e149b2a4c91"}
Apr 22 17:37:12.632691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.632552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"cc6994d8486613b07143bdf840ac5d96f849c886d5dfafe8d7af6956ec435b66"}
Apr 22 17:37:12.632691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.632561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"36225688c1b61ca2b743e3efe1fd31f1eb329ad19412922887c4ced21f0cef53"}
Apr 22 17:37:12.632691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.632568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"edab840dd9b72c81d6f451eaec2c99ed7cf113a51dc49e13c6cf80b7ff91aee6"}
Apr 22 17:37:12.633810 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.633782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66csc" event={"ID":"7c3b2116-a369-4692-8afd-099a5a5a39cd","Type":"ContainerStarted","Data":"bafc4559ee9344470de420ea5eac008c3bd11d5df2a5a0c5f51541d33310a50e"}
Apr 22 17:37:12.635219 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.635199 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m45f5" event={"ID":"8da954ec-226e-4e6e-a43d-0ef4bc182e6c","Type":"ContainerStarted","Data":"5fd90288a75ace4fdca249373ef6de35014308bb546792e6778760a7affaf94a"}
Apr 22 17:37:12.635313 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.635222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m45f5" event={"ID":"8da954ec-226e-4e6e-a43d-0ef4bc182e6c","Type":"ContainerStarted","Data":"53b703354ddad44666f6ab07d9004d57aed1d25722c5f589dac1c25d993e30b4"}
Apr 22 17:37:12.635357 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.635319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:12.650302 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.650259 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-66csc" podStartSLOduration=129.632265596 podStartE2EDuration="2m12.650246402s" podCreationTimestamp="2026-04-22 17:35:00 +0000 UTC" firstStartedPulling="2026-04-22 17:37:08.950965283 +0000 UTC m=+162.430221944" lastFinishedPulling="2026-04-22 17:37:11.968946089 +0000 UTC m=+165.448202750" observedRunningTime="2026-04-22 17:37:12.64924127 +0000 UTC m=+166.128497951" watchObservedRunningTime="2026-04-22 17:37:12.650246402 +0000 UTC m=+166.129503078"
Apr 22 17:37:12.666252 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:12.666190 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m45f5" podStartSLOduration=129.647808601 podStartE2EDuration="2m12.666169172s" podCreationTimestamp="2026-04-22 17:35:00 +0000 UTC" firstStartedPulling="2026-04-22 17:37:08.950966158 +0000 UTC m=+162.430222833" lastFinishedPulling="2026-04-22 17:37:11.969326741 +0000 UTC m=+165.448583404" observedRunningTime="2026-04-22 17:37:12.665752948 +0000 UTC m=+166.145009632" watchObservedRunningTime="2026-04-22 17:37:12.666169172 +0000 UTC m=+166.145425854"
Apr 22 17:37:13.641126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.641090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerStarted","Data":"68053045e9b350de236356f912e74a22c8c5d8e1c5f8d30c00b73ccd5b280cfa"}
Apr 22 17:37:13.643598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.643567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"77e3cfd69ba7e684e176e2533923c63414292f5acf6bd585c51272f83b6e11ca"}
Apr 22 17:37:13.643751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.643603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"b0a3ea4034bdd6c11179020ec99b0bb0c9ee592d3f66d7af2547483a764e520a"}
Apr 22 17:37:13.643751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.643613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" event={"ID":"d32d0e48-df72-4243-81b9-ba8cf37c6bb6","Type":"ContainerStarted","Data":"79d5a97db355ebf2590eeb14cdb8bf6164b55d55b5226977344fc8efeb9305aa"}
Apr 22 17:37:13.644093 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.644070 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:13.666755 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.666691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.043717948 podStartE2EDuration="6.666675359s" podCreationTimestamp="2026-04-22 17:37:07 +0000 UTC" firstStartedPulling="2026-04-22 17:37:07.55969355 +0000 UTC m=+161.038950209" lastFinishedPulling="2026-04-22 17:37:13.182650957 +0000 UTC m=+166.661907620" observedRunningTime="2026-04-22 17:37:13.665818636 +0000 UTC m=+167.145075318" watchObservedRunningTime="2026-04-22 17:37:13.666675359 +0000 UTC m=+167.145932038"
Apr 22 17:37:13.686789 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:13.686604 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b" podStartSLOduration=1.252848831 podStartE2EDuration="5.686584128s" podCreationTimestamp="2026-04-22 17:37:08 +0000 UTC" firstStartedPulling="2026-04-22 17:37:08.747431506 +0000 UTC m=+162.226688166" lastFinishedPulling="2026-04-22 17:37:13.181166783 +0000 UTC m=+166.660423463" observedRunningTime="2026-04-22 17:37:13.685217934 +0000 UTC m=+167.164474618" watchObservedRunningTime="2026-04-22 17:37:13.686584128 +0000 UTC m=+167.165840811"
Apr 22 17:37:15.098128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:15.098102 2572 scope.go:117] "RemoveContainer" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de"
Apr 22 17:37:15.098471 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:15.098283 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gpsv4_openshift-console-operator(31b62473-ecf9-4e50-926e-50d8d8ca9231)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" podUID="31b62473-ecf9-4e50-926e-50d8d8ca9231"
Apr 22 17:37:17.096309 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:17.096222 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:37:19.652874 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:19.652840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7969c5c58b-f8f2b"
Apr 22 17:37:19.735244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:19.735218 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"
Apr 22 17:37:22.645541 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:22.645512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m45f5"
Apr 22 17:37:24.749513 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:24.749453 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerName="registry" containerID="cri-o://6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347" gracePeriod=30
Apr 22 17:37:24.974765 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:24.974743 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"
Apr 22 17:37:25.124645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124567 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124612 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124641 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124666 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124746 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-865ls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124780 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124815 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.124911 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.124842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets\") pod \"579a3ebc-9a66-4781-8220-a94c0b7a620a\" (UID: \"579a3ebc-9a66-4781-8220-a94c0b7a620a\") "
Apr 22 17:37:25.125229 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.125194 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:37:25.125317 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.125203 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:37:25.127524 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.127490 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:25.127622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.127552 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:37:25.127622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.127573 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:25.127760 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.127743 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls" (OuterVolumeSpecName: "kube-api-access-865ls") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "kube-api-access-865ls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:25.127760 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.127751 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:37:25.133428 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.133404 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "579a3ebc-9a66-4781-8220-a94c0b7a620a" (UID: "579a3ebc-9a66-4781-8220-a94c0b7a620a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:37:25.226138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226097 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-certificates\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226130 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-image-registry-private-configuration\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226144 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/579a3ebc-9a66-4781-8220-a94c0b7a620a-trusted-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226374 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226157 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-bound-sa-token\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226374 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226169 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-865ls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-kube-api-access-865ls\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226374 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226181 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/579a3ebc-9a66-4781-8220-a94c0b7a620a-ca-trust-extracted\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226374 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226192 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/579a3ebc-9a66-4781-8220-a94c0b7a620a-registry-tls\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.226374 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.226206 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/579a3ebc-9a66-4781-8220-a94c0b7a620a-installation-pull-secrets\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:37:25.677410 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.677376 2572 generic.go:358] "Generic (PLEG): container finished" podID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerID="6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347" exitCode=0
Apr 22 17:37:25.677576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.677462 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"
Apr 22 17:37:25.677576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.677457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" event={"ID":"579a3ebc-9a66-4781-8220-a94c0b7a620a","Type":"ContainerDied","Data":"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"}
Apr 22 17:37:25.677576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.677564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d6bdb5fd9-bg7x4" event={"ID":"579a3ebc-9a66-4781-8220-a94c0b7a620a","Type":"ContainerDied","Data":"2eba8b9e24a161a17f2e4c79780d6e7ec309b515042bd1400862323175bf55e9"}
Apr 22 17:37:25.677718 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.677580 2572 scope.go:117] "RemoveContainer" containerID="6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"
Apr 22 17:37:25.685799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.685779 2572 scope.go:117] "RemoveContainer" containerID="6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"
Apr 22 17:37:25.686075 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:37:25.686054 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347\": container with ID starting with 6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347 not found: ID does not exist" containerID="6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"
Apr 22 17:37:25.686124 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.686083 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347"} err="failed to get container status
\"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347\": rpc error: code = NotFound desc = could not find container \"6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347\": container with ID starting with 6b6a9677af871a3d63fa2cb1dbabc16d0f7b0033831c11d2596d81e79a979347 not found: ID does not exist" Apr 22 17:37:25.698429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.698410 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"] Apr 22 17:37:25.702239 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:25.702219 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d6bdb5fd9-bg7x4"] Apr 22 17:37:27.099294 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:27.099263 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" path="/var/lib/kubelet/pods/579a3ebc-9a66-4781-8220-a94c0b7a620a/volumes" Apr 22 17:37:30.094620 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.094589 2572 scope.go:117] "RemoveContainer" containerID="5490afec7a733faa6600f0ad18a59671b8699bc70b2294e68b3c3011f06195de" Apr 22 17:37:30.692953 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.692923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 17:37:30.693135 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.693020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" event={"ID":"31b62473-ecf9-4e50-926e-50d8d8ca9231","Type":"ContainerStarted","Data":"7e283113b65c9e43a1792818054243a25ccc84a5b63cbe80bfdfe9afb71ce74c"} Apr 22 17:37:30.693311 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.693291 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:37:30.796020 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.795989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-gpsv4" Apr 22 17:37:30.974622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.974542 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-74lqm"] Apr 22 17:37:30.974910 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.974897 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerName="registry" Apr 22 17:37:30.974957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.974912 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerName="registry" Apr 22 17:37:30.974991 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.974969 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="579a3ebc-9a66-4781-8220-a94c0b7a620a" containerName="registry" Apr 22 17:37:30.977923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.977903 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:30.982923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.982814 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-gwclw\"" Apr 22 17:37:30.982923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.982846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:37:30.983139 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.982987 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:37:30.994813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:30.994790 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-74lqm"] Apr 22 17:37:31.077837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.077798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbxm\" (UniqueName: \"kubernetes.io/projected/df068068-e066-4a5c-86c3-8ad5f03f5f19-kube-api-access-gbbxm\") pod \"downloads-6bcc868b7-74lqm\" (UID: \"df068068-e066-4a5c-86c3-8ad5f03f5f19\") " pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:31.178992 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.178959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbxm\" (UniqueName: \"kubernetes.io/projected/df068068-e066-4a5c-86c3-8ad5f03f5f19-kube-api-access-gbbxm\") pod \"downloads-6bcc868b7-74lqm\" (UID: \"df068068-e066-4a5c-86c3-8ad5f03f5f19\") " pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:31.188347 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.188315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbxm\" (UniqueName: 
\"kubernetes.io/projected/df068068-e066-4a5c-86c3-8ad5f03f5f19-kube-api-access-gbbxm\") pod \"downloads-6bcc868b7-74lqm\" (UID: \"df068068-e066-4a5c-86c3-8ad5f03f5f19\") " pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:31.287501 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.287413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:31.411090 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.411061 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-74lqm"] Apr 22 17:37:31.413949 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:31.413922 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf068068_e066_4a5c_86c3_8ad5f03f5f19.slice/crio-4d35cbe0a30c6281b22eca382faa7892be644ec66ebc3124ca58ce89dfc1f8fc WatchSource:0}: Error finding container 4d35cbe0a30c6281b22eca382faa7892be644ec66ebc3124ca58ce89dfc1f8fc: Status 404 returned error can't find the container with id 4d35cbe0a30c6281b22eca382faa7892be644ec66ebc3124ca58ce89dfc1f8fc Apr 22 17:37:31.697045 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:31.697007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-74lqm" event={"ID":"df068068-e066-4a5c-86c3-8ad5f03f5f19","Type":"ContainerStarted","Data":"4d35cbe0a30c6281b22eca382faa7892be644ec66ebc3124ca58ce89dfc1f8fc"} Apr 22 17:37:41.900581 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.900548 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:37:41.905598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.905577 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:41.909100 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909076 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:37:41.909213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909084 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:37:41.909213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909084 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:37:41.909213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909131 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8zgq2\"" Apr 22 17:37:41.909213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909085 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:37:41.909213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.909171 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:37:41.915594 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:41.915573 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:37:42.074658 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.074622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.074856 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:37:42.074693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56cd\" (UniqueName: \"kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.074856 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.074743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.074856 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.074808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.075011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.074859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.075011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.074926 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config\") pod \"console-847d7bd684-x2fkt\" (UID: 
\"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.175805 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.175805 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w56cd\" (UniqueName: 
\"kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176030 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.175955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176656 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.176538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176656 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.176594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.176846 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.176771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.178888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.178866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.178964 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.178921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.185373 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.185327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56cd\" (UniqueName: \"kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd\") pod \"console-847d7bd684-x2fkt\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.216215 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.216185 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:42.348216 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.348182 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:37:42.350789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:42.350749 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode29805e0_ecc6_4d63_aebe_031e1646b3cd.slice/crio-1027b68e3c1d00cc44496515ea5ff0bdd612524836235b3a9639796ad02fa965 WatchSource:0}: Error finding container 1027b68e3c1d00cc44496515ea5ff0bdd612524836235b3a9639796ad02fa965: Status 404 returned error can't find the container with id 1027b68e3c1d00cc44496515ea5ff0bdd612524836235b3a9639796ad02fa965 Apr 22 17:37:42.728300 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:42.728263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d7bd684-x2fkt" event={"ID":"e29805e0-ecc6-4d63-aebe-031e1646b3cd","Type":"ContainerStarted","Data":"1027b68e3c1d00cc44496515ea5ff0bdd612524836235b3a9639796ad02fa965"} Apr 22 17:37:45.739474 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:45.739417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d7bd684-x2fkt" event={"ID":"e29805e0-ecc6-4d63-aebe-031e1646b3cd","Type":"ContainerStarted","Data":"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8"} Apr 22 17:37:45.759971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:45.759913 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-847d7bd684-x2fkt" podStartSLOduration=1.509448623 podStartE2EDuration="4.759892443s" podCreationTimestamp="2026-04-22 17:37:41 +0000 UTC" firstStartedPulling="2026-04-22 17:37:42.353132035 +0000 UTC m=+195.832388698" lastFinishedPulling="2026-04-22 17:37:45.603575855 +0000 UTC m=+199.082832518" 
observedRunningTime="2026-04-22 17:37:45.758711267 +0000 UTC m=+199.237967942" watchObservedRunningTime="2026-04-22 17:37:45.759892443 +0000 UTC m=+199.239149150" Apr 22 17:37:52.217249 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.217211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:52.217249 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.217258 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:52.222459 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.222431 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:52.278367 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.278333 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"] Apr 22 17:37:52.281978 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.281954 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.289144 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.289120 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 17:37:52.295461 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.295439 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"] Apr 22 17:37:52.370653 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.370620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.370653 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.370663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txl29\" (UniqueName: \"kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.370897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.370815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.370897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.370891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.370990 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.370921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.371041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.371000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.371094 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.371047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472359 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472359 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:37:52.472335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.472599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 
17:37:52.472813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.472608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txl29\" (UniqueName: \"kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.473508 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.473476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.473624 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.473512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.473875 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.473852 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.474074 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.474051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " 
pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.475276 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.475251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.475403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.475358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.481753 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.481729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txl29\" (UniqueName: \"kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29\") pod \"console-6fc87c8d7b-q9g75\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") " pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.593714 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.593649 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:37:52.764693 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:52.764610 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:37:55.441039 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.441013 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"] Apr 22 17:37:55.456021 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:37:55.455986 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884d8fa6_6f42_4cfe_8f7b_9a97fb26dab6.slice/crio-39a9fdf14326ee873da5c94f5bdc13d06195b69dc1c6cfe2ef66cb62a2aa526d WatchSource:0}: Error finding container 39a9fdf14326ee873da5c94f5bdc13d06195b69dc1c6cfe2ef66cb62a2aa526d: Status 404 returned error can't find the container with id 39a9fdf14326ee873da5c94f5bdc13d06195b69dc1c6cfe2ef66cb62a2aa526d Apr 22 17:37:55.770678 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.770644 2572 generic.go:358] "Generic (PLEG): container finished" podID="611abd7b-ca37-43e2-a480-f0fc4b1aa306" containerID="b27d4df6740cebfd169fc5c555abd9225586edc21a10f7717a5bda3149e9005d" exitCode=0 Apr 22 17:37:55.770888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.770729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k6554" event={"ID":"611abd7b-ca37-43e2-a480-f0fc4b1aa306","Type":"ContainerDied","Data":"b27d4df6740cebfd169fc5c555abd9225586edc21a10f7717a5bda3149e9005d"} Apr 22 17:37:55.771246 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.771224 2572 scope.go:117] "RemoveContainer" containerID="b27d4df6740cebfd169fc5c555abd9225586edc21a10f7717a5bda3149e9005d" Apr 22 17:37:55.772432 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.772395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6fc87c8d7b-q9g75" event={"ID":"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6","Type":"ContainerStarted","Data":"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"} Apr 22 17:37:55.772432 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.772427 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc87c8d7b-q9g75" event={"ID":"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6","Type":"ContainerStarted","Data":"39a9fdf14326ee873da5c94f5bdc13d06195b69dc1c6cfe2ef66cb62a2aa526d"} Apr 22 17:37:55.774505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.774473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-74lqm" event={"ID":"df068068-e066-4a5c-86c3-8ad5f03f5f19","Type":"ContainerStarted","Data":"3a57de12b0518385d0e9266259ab715cf4c0a9f1464d35eafe89930f5b053544"} Apr 22 17:37:55.774743 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.774687 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:55.790432 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.790411 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-74lqm" Apr 22 17:37:55.803514 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:55.803434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-74lqm" podStartSLOduration=1.806423417 podStartE2EDuration="25.80341494s" podCreationTimestamp="2026-04-22 17:37:30 +0000 UTC" firstStartedPulling="2026-04-22 17:37:31.416066197 +0000 UTC m=+184.895322857" lastFinishedPulling="2026-04-22 17:37:55.413057717 +0000 UTC m=+208.892314380" observedRunningTime="2026-04-22 17:37:55.802279165 +0000 UTC m=+209.281535848" watchObservedRunningTime="2026-04-22 17:37:55.80341494 +0000 UTC m=+209.282671623" Apr 22 17:37:55.819509 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:37:55.819466 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fc87c8d7b-q9g75" podStartSLOduration=3.819450228 podStartE2EDuration="3.819450228s" podCreationTimestamp="2026-04-22 17:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:55.817548923 +0000 UTC m=+209.296805605" watchObservedRunningTime="2026-04-22 17:37:55.819450228 +0000 UTC m=+209.298706911" Apr 22 17:37:56.779527 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:37:56.779479 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k6554" event={"ID":"611abd7b-ca37-43e2-a480-f0fc4b1aa306","Type":"ContainerStarted","Data":"8d9b332f2b59528019e67f8f6cf2788c79d369830b4735f3f0d8ead86dec536a"} Apr 22 17:38:02.594836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:02.594795 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:38:02.595385 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:02.594964 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:38:02.600715 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:02.600671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:38:02.802302 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:02.802261 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fc87c8d7b-q9g75" Apr 22 17:38:02.842813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:02.842773 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:38:26.337602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.337563 2572 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:26.338106 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338078 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="alertmanager" containerID="cri-o://edab840dd9b72c81d6f451eaec2c99ed7cf113a51dc49e13c6cf80b7ff91aee6" gracePeriod=120 Apr 22 17:38:26.338187 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338130 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-metric" containerID="cri-o://a02ed83049c2bea8be6e5c21dda92b70a9305beb72416c87a005a4040af12fbb" gracePeriod=120 Apr 22 17:38:26.338246 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338172 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-web" containerID="cri-o://cc6994d8486613b07143bdf840ac5d96f849c886d5dfafe8d7af6956ec435b66" gracePeriod=120 Apr 22 17:38:26.338246 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338187 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy" containerID="cri-o://4ef67ebd68307f7af374760666ea396969d1823f08a56cacdee89e149b2a4c91" gracePeriod=120 Apr 22 17:38:26.338340 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338226 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="config-reloader" containerID="cri-o://36225688c1b61ca2b743e3efe1fd31f1eb329ad19412922887c4ced21f0cef53" gracePeriod=120 
Apr 22 17:38:26.338340 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.338196 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="prom-label-proxy" containerID="cri-o://68053045e9b350de236356f912e74a22c8c5d8e1c5f8d30c00b73ccd5b280cfa" gracePeriod=120 Apr 22 17:38:26.871198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871169 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="68053045e9b350de236356f912e74a22c8c5d8e1c5f8d30c00b73ccd5b280cfa" exitCode=0 Apr 22 17:38:26.871198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871192 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="4ef67ebd68307f7af374760666ea396969d1823f08a56cacdee89e149b2a4c91" exitCode=0 Apr 22 17:38:26.871198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871199 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="36225688c1b61ca2b743e3efe1fd31f1eb329ad19412922887c4ced21f0cef53" exitCode=0 Apr 22 17:38:26.871198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871204 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="edab840dd9b72c81d6f451eaec2c99ed7cf113a51dc49e13c6cf80b7ff91aee6" exitCode=0 Apr 22 17:38:26.871465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"68053045e9b350de236356f912e74a22c8c5d8e1c5f8d30c00b73ccd5b280cfa"} Apr 22 17:38:26.871465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"4ef67ebd68307f7af374760666ea396969d1823f08a56cacdee89e149b2a4c91"} Apr 22 17:38:26.871465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871282 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"36225688c1b61ca2b743e3efe1fd31f1eb329ad19412922887c4ced21f0cef53"} Apr 22 17:38:26.871465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:26.871291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"edab840dd9b72c81d6f451eaec2c99ed7cf113a51dc49e13c6cf80b7ff91aee6"} Apr 22 17:38:27.866329 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:27.866277 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-847d7bd684-x2fkt" podUID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" containerName="console" containerID="cri-o://c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8" gracePeriod=15 Apr 22 17:38:27.877775 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:27.877747 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="a02ed83049c2bea8be6e5c21dda92b70a9305beb72416c87a005a4040af12fbb" exitCode=0 Apr 22 17:38:27.877775 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:27.877771 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerID="cc6994d8486613b07143bdf840ac5d96f849c886d5dfafe8d7af6956ec435b66" exitCode=0 Apr 22 17:38:27.877942 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:27.877815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"a02ed83049c2bea8be6e5c21dda92b70a9305beb72416c87a005a4040af12fbb"} Apr 22 17:38:27.877942 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:27.877848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"cc6994d8486613b07143bdf840ac5d96f849c886d5dfafe8d7af6956ec435b66"} Apr 22 17:38:28.098103 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.098078 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:28.129393 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.129369 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847d7bd684-x2fkt_e29805e0-ecc6-4d63-aebe-031e1646b3cd/console/0.log" Apr 22 17:38:28.129556 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.129437 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:38:28.194719 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194663 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.194719 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194724 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194747 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194767 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89v28\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194782 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: 
\"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194806 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194836 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194881 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194907 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56cd\" (UniqueName: \"kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.194957 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.194935 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.195401 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:38:28.195199 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:28.195401 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195218 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195597 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195677 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195727 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195781 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config\") pod \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195815 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195856 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195884 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195925 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db\") pod 
\"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\" (UID: \"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.195968 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert\") pod \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\" (UID: \"e29805e0-ecc6-4d63-aebe-031e1646b3cd\") " Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.196915 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config" (OuterVolumeSpecName: "console-config") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: "e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197331 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: "e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197626 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197728 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-metrics-client-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197747 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197760 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-service-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197772 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.197868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197783 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-alertmanager-main-db\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.198763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.197954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: 
"e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:28.198763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198440 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: "e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.198763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198449 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.198763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198460 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28" (OuterVolumeSpecName: "kube-api-access-89v28") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "kube-api-access-89v28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:28.198977 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198800 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.198977 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198814 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.199092 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.198974 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd" (OuterVolumeSpecName: "kube-api-access-w56cd") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: "e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "kube-api-access-w56cd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:28.199406 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.199382 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:28.199601 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.199564 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out" (OuterVolumeSpecName: "config-out") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:28.199687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.199645 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.200161 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.200141 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e29805e0-ecc6-4d63-aebe-031e1646b3cd" (UID: "e29805e0-ecc6-4d63-aebe-031e1646b3cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.200505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.200487 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.204594 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.204569 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.208751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.208732 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config" (OuterVolumeSpecName: "web-config") pod "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" (UID: "e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:28.299017 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.298986 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-cluster-tls-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299017 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299012 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-oauth-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299017 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299025 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e29805e0-ecc6-4d63-aebe-031e1646b3cd-oauth-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299033 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e29805e0-ecc6-4d63-aebe-031e1646b3cd-console-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299042 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-main-tls\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299051 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89v28\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-kube-api-access-89v28\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299060 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-tls-assets\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299069 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-web-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299078 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299088 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w56cd\" (UniqueName: \"kubernetes.io/projected/e29805e0-ecc6-4d63-aebe-031e1646b3cd-kube-api-access-w56cd\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299096 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-volume\") on 
node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299104 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-config-out\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299112 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.299257 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.299120 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.883604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.883513 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8","Type":"ContainerDied","Data":"eb9ed3546939a7ca3e35f470afc0f59c1fbf5173a950b8d7388874a1bece528a"} Apr 22 17:38:28.883604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.883561 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:28.884082 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.883563 2572 scope.go:117] "RemoveContainer" containerID="68053045e9b350de236356f912e74a22c8c5d8e1c5f8d30c00b73ccd5b280cfa" Apr 22 17:38:28.885016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.884789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847d7bd684-x2fkt_e29805e0-ecc6-4d63-aebe-031e1646b3cd/console/0.log" Apr 22 17:38:28.885016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.884832 2572 generic.go:358] "Generic (PLEG): container finished" podID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" containerID="c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8" exitCode=2 Apr 22 17:38:28.885016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.884868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d7bd684-x2fkt" event={"ID":"e29805e0-ecc6-4d63-aebe-031e1646b3cd","Type":"ContainerDied","Data":"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8"} Apr 22 17:38:28.885016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.884892 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-847d7bd684-x2fkt" Apr 22 17:38:28.885016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.884892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847d7bd684-x2fkt" event={"ID":"e29805e0-ecc6-4d63-aebe-031e1646b3cd","Type":"ContainerDied","Data":"1027b68e3c1d00cc44496515ea5ff0bdd612524836235b3a9639796ad02fa965"} Apr 22 17:38:28.892543 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.892522 2572 scope.go:117] "RemoveContainer" containerID="a02ed83049c2bea8be6e5c21dda92b70a9305beb72416c87a005a4040af12fbb" Apr 22 17:38:28.899357 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.899343 2572 scope.go:117] "RemoveContainer" containerID="4ef67ebd68307f7af374760666ea396969d1823f08a56cacdee89e149b2a4c91" Apr 22 17:38:28.908052 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.908014 2572 scope.go:117] "RemoveContainer" containerID="cc6994d8486613b07143bdf840ac5d96f849c886d5dfafe8d7af6956ec435b66" Apr 22 17:38:28.909031 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.909007 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:38:28.912828 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.912806 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-847d7bd684-x2fkt"] Apr 22 17:38:28.915762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.915744 2572 scope.go:117] "RemoveContainer" containerID="36225688c1b61ca2b743e3efe1fd31f1eb329ad19412922887c4ced21f0cef53" Apr 22 17:38:28.922475 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.922455 2572 scope.go:117] "RemoveContainer" containerID="edab840dd9b72c81d6f451eaec2c99ed7cf113a51dc49e13c6cf80b7ff91aee6" Apr 22 17:38:28.925165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.925147 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:28.929174 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:38:28.929155 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:28.929623 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.929611 2572 scope.go:117] "RemoveContainer" containerID="1c83890e6fa1ff4785d031446cfbd4e62084304f3e5f41e4df55eefd6585b4f6" Apr 22 17:38:28.941011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.940981 2572 scope.go:117] "RemoveContainer" containerID="c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8" Apr 22 17:38:28.948051 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.948023 2572 scope.go:117] "RemoveContainer" containerID="c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8" Apr 22 17:38:28.948312 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:38:28.948292 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8\": container with ID starting with c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8 not found: ID does not exist" containerID="c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8" Apr 22 17:38:28.948354 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.948323 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8"} err="failed to get container status \"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8\": rpc error: code = NotFound desc = could not find container \"c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8\": container with ID starting with c6f44547cb6716eb64ead3d0d399f0dc5a9788a424c909d6ed16de6e1ed498e8 not found: ID does not exist" Apr 22 17:38:28.953403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953380 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:28.953672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953661 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-web" Apr 22 17:38:28.953738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953674 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-web" Apr 22 17:38:28.953738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953683 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="prom-label-proxy" Apr 22 17:38:28.953738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953690 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="prom-label-proxy" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953741 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="alertmanager" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953752 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="alertmanager" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953759 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" containerName="console" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953764 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" containerName="console" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953774 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-metric" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953779 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-metric" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953792 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953797 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953804 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="config-reloader" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953809 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="config-reloader" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953816 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="init-config-reloader" Apr 22 17:38:28.953829 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953821 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="init-config-reloader" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953881 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-metric" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953888 2572 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="alertmanager" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953895 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953902 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="kube-rbac-proxy-web" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953910 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" containerName="console" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953915 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="prom-label-proxy" Apr 22 17:38:28.954146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.953921 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" containerName="config-reloader" Apr 22 17:38:28.995449 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.995414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:28.995602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.995567 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:28.998097 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998072 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:38:28.998227 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998073 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:38:28.998227 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998211 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:38:28.998227 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998223 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:38:28.998395 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998293 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:38:28.998509 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998490 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:38:28.998587 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998537 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:38:28.998748 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s8drv\"" Apr 22 17:38:28.998880 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:28.998865 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:38:29.002563 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.002542 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:38:29.097556 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.097521 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29805e0-ecc6-4d63-aebe-031e1646b3cd" path="/var/lib/kubelet/pods/e29805e0-ecc6-4d63-aebe-031e1646b3cd/volumes" Apr 22 17:38:29.097925 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.097911 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8" path="/var/lib/kubelet/pods/e9c72a2a-ec47-49ee-9aa8-3deafd8ce2f8/volumes" Apr 22 17:38:29.106208 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106278 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106278 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-config-out\") pod \"alertmanager-main-0\" 
(UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106278 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-web-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106456 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbgl5\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-kube-api-access-lbgl5\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106766 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-main-db\") pod 
\"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.106766 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.106646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-config-volume\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207404 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-config-volume\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207731 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-config-out\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207937 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207824 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-web-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207937 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207937 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.207937 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbgl5\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-kube-api-access-lbgl5\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.208146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.207967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
17:38:29.208146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.208013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.208301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.208277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.208986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.208961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91a6df17-c871-4536-9033-6a7971a37d9f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.210434 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.210408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91a6df17-c871-4536-9033-6a7971a37d9f-config-out\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.210826 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.210614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.210826 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.210715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.210826 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.210766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.211253 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.211225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-config-volume\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.211253 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.211251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.211491 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.211470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.211559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.211532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91a6df17-c871-4536-9033-6a7971a37d9f-web-config\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.212401 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.212381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.217186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.217163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbgl5\" (UniqueName: \"kubernetes.io/projected/91a6df17-c871-4536-9033-6a7971a37d9f-kube-api-access-lbgl5\") pod \"alertmanager-main-0\" (UID: \"91a6df17-c871-4536-9033-6a7971a37d9f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.305988 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.305957 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:38:29.429916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.429883 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:38:29.433660 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:38:29.433632 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a6df17_c871_4536_9033_6a7971a37d9f.slice/crio-7d3f43e256c250470c59f80e2ca1ca5ce55292739146b5c2671419715f059bc2 WatchSource:0}: Error finding container 7d3f43e256c250470c59f80e2ca1ca5ce55292739146b5c2671419715f059bc2: Status 404 returned error can't find the container with id 7d3f43e256c250470c59f80e2ca1ca5ce55292739146b5c2671419715f059bc2 Apr 22 17:38:29.889239 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.889211 2572 generic.go:358] "Generic (PLEG): container finished" podID="91a6df17-c871-4536-9033-6a7971a37d9f" containerID="1baa3631aab6b8bede8b54f0ae8e3c4759648da5aad1134dd407598e004155ab" exitCode=0 Apr 22 17:38:29.889651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.889300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerDied","Data":"1baa3631aab6b8bede8b54f0ae8e3c4759648da5aad1134dd407598e004155ab"} Apr 22 17:38:29.889651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:29.889343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"7d3f43e256c250470c59f80e2ca1ca5ce55292739146b5c2671419715f059bc2"} Apr 22 17:38:30.368772 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.368737 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt"] Apr 22 17:38:30.387259 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:38:30.387231 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt"] Apr 22 17:38:30.387401 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.387388 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.389813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.389788 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 17:38:30.389813 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.389807 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-fcgwb\"" Apr 22 17:38:30.390295 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.389788 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 17:38:30.390295 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.389889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 17:38:30.390295 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.389939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 17:38:30.390295 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.390031 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 17:38:30.395363 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.395140 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 17:38:30.417512 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:38:30.417454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-metrics-client-ca\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417708 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-serving-certs-ca-bundle\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417766 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-federate-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417766 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417839 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.417839 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.417802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlflb\" (UniqueName: \"kubernetes.io/projected/79bfd8d1-2166-4874-9a74-b05e77923eae-kube-api-access-vlflb\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519193 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlflb\" (UniqueName: \"kubernetes.io/projected/79bfd8d1-2166-4874-9a74-b05e77923eae-kube-api-access-vlflb\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-metrics-client-ca\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519373 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-serving-certs-ca-bundle\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-federate-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.519586 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.519459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.520106 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.520080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-metrics-client-ca\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.520332 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.520304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.520451 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.520424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfd8d1-2166-4874-9a74-b05e77923eae-serving-certs-ca-bundle\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.521946 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.521919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-telemeter-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.522349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.522326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.522393 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.522377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-secret-telemeter-client\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " 
pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.522491 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.522473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79bfd8d1-2166-4874-9a74-b05e77923eae-federate-client-tls\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.527019 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.526997 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlflb\" (UniqueName: \"kubernetes.io/projected/79bfd8d1-2166-4874-9a74-b05e77923eae-kube-api-access-vlflb\") pod \"telemeter-client-7d67bb5c78-h5pqt\" (UID: \"79bfd8d1-2166-4874-9a74-b05e77923eae\") " pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.698458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.698364 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" Apr 22 17:38:30.819455 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.819427 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt"] Apr 22 17:38:30.896726 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.896679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"70785b25cdb9916cc59d5704e24c36b0ac14eede0ea733b9e23f3b1cf91d678e"} Apr 22 17:38:30.897128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.896734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"8653db10a6978dde4a82e590cc74d7a83117f11ab6f9ac20a1082e5fded78074"} Apr 22 17:38:30.897128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.896747 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"1c9c22c2c50f8fe8fe2f86e9ccc39d1f19ed7f6e0271b6a5f32e0ad2b2b56743"} Apr 22 17:38:30.897128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.896761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"50b46ee6c4f58254fbe2a3c06fd48ccdc97451250ab119516d8f44c8bb6fc553"} Apr 22 17:38:30.897128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.896771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"276eb2e778143247fc5e473c89a64b33d22d875221cba5897c9f1073c9d1ae2b"} Apr 22 17:38:30.897128 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:38:30.896780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"91a6df17-c871-4536-9033-6a7971a37d9f","Type":"ContainerStarted","Data":"739b29cd25a0c9889635f316a956799d7d274429e4c62d44746fa4b857c9997f"} Apr 22 17:38:30.897754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.897732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" event={"ID":"79bfd8d1-2166-4874-9a74-b05e77923eae","Type":"ContainerStarted","Data":"51cca978bd8e68fe8b4124ddc6489232bd3b242be82506d41a84d35e2b84e6a7"} Apr 22 17:38:30.922778 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:30.922346 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.922327964 podStartE2EDuration="2.922327964s" podCreationTimestamp="2026-04-22 17:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:38:30.921101785 +0000 UTC m=+244.400358469" watchObservedRunningTime="2026-04-22 17:38:30.922327964 +0000 UTC m=+244.401584648" Apr 22 17:38:33.910215 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:33.910180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" event={"ID":"79bfd8d1-2166-4874-9a74-b05e77923eae","Type":"ContainerStarted","Data":"e29ca062830424b9225b566b30d9c33c434b1aa0ae42d56611df7ece032e8358"} Apr 22 17:38:33.910215 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:33.910216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" event={"ID":"79bfd8d1-2166-4874-9a74-b05e77923eae","Type":"ContainerStarted","Data":"a822caad2743cf80c3184e049e4cd79543a94944b28ad7f2b6985215ec35a1bf"} Apr 22 17:38:33.910661 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:38:33.910225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" event={"ID":"79bfd8d1-2166-4874-9a74-b05e77923eae","Type":"ContainerStarted","Data":"b396874af9f573df0f3b248890c56dd4e1a8e80c91057a5c669c1b7ebe5b0ad8"} Apr 22 17:38:33.933032 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:33.932980 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7d67bb5c78-h5pqt" podStartSLOduration=1.605253624 podStartE2EDuration="3.932966072s" podCreationTimestamp="2026-04-22 17:38:30 +0000 UTC" firstStartedPulling="2026-04-22 17:38:30.827390358 +0000 UTC m=+244.306647033" lastFinishedPulling="2026-04-22 17:38:33.15510282 +0000 UTC m=+246.634359481" observedRunningTime="2026-04-22 17:38:33.932222852 +0000 UTC m=+247.411479574" watchObservedRunningTime="2026-04-22 17:38:33.932966072 +0000 UTC m=+247.412222785" Apr 22 17:38:34.703336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.703303 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"] Apr 22 17:38:34.723178 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.723149 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"] Apr 22 17:38:34.723330 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.723266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756573 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756573 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwrw\" (UniqueName: \"kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.756941 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.756771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858568 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858568 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858589 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.858831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwrw\" (UniqueName: \"kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.859027 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.858931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 
17:38:34.859283 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.859256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.859399 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.859369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.859656 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.859638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.859868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.859849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.861536 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.861514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " 
pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.861636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.861611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:34.868099 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:34.868067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwrw\" (UniqueName: \"kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw\") pod \"console-6bb4b54cf6-4px5d\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") " pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:35.032458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:35.032366 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:38:35.150868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:35.150844 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"] Apr 22 17:38:35.156144 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:38:35.153795 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97a52af_4f2e_40cf_9b6c_6363047ec549.slice/crio-27ce9443a3dcee56ee81910ee040ba5fb3b44889daf4a0563637d8e1dada00b5 WatchSource:0}: Error finding container 27ce9443a3dcee56ee81910ee040ba5fb3b44889daf4a0563637d8e1dada00b5: Status 404 returned error can't find the container with id 27ce9443a3dcee56ee81910ee040ba5fb3b44889daf4a0563637d8e1dada00b5 Apr 22 17:38:35.918047 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:35.918012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb4b54cf6-4px5d" event={"ID":"d97a52af-4f2e-40cf-9b6c-6363047ec549","Type":"ContainerStarted","Data":"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d"} Apr 22 17:38:35.918047 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:35.918052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb4b54cf6-4px5d" event={"ID":"d97a52af-4f2e-40cf-9b6c-6363047ec549","Type":"ContainerStarted","Data":"27ce9443a3dcee56ee81910ee040ba5fb3b44889daf4a0563637d8e1dada00b5"} Apr 22 17:38:35.936289 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:35.936242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bb4b54cf6-4px5d" podStartSLOduration=1.936224709 podStartE2EDuration="1.936224709s" podCreationTimestamp="2026-04-22 17:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:38:35.936082099 +0000 UTC m=+249.415338782" 
watchObservedRunningTime="2026-04-22 17:38:35.936224709 +0000 UTC m=+249.415481396"
Apr 22 17:38:38.893072 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:38.893038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:38:38.895327 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:38.895302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c7625b-b71f-4d8d-a883-c465098dbba7-metrics-certs\") pod \"network-metrics-daemon-djttm\" (UID: \"34c7625b-b71f-4d8d-a883-c465098dbba7\") " pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:38:39.002108 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:39.002072 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\""
Apr 22 17:38:39.008849 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:39.008826 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-djttm"
Apr 22 17:38:39.144979 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:39.144896 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-djttm"]
Apr 22 17:38:39.148510 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:38:39.148483 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c7625b_b71f_4d8d_a883_c465098dbba7.slice/crio-3c36271bc43cc5eff12b1157c19768a57224e179d6734b4779da10dcd1fcdbcd WatchSource:0}: Error finding container 3c36271bc43cc5eff12b1157c19768a57224e179d6734b4779da10dcd1fcdbcd: Status 404 returned error can't find the container with id 3c36271bc43cc5eff12b1157c19768a57224e179d6734b4779da10dcd1fcdbcd
Apr 22 17:38:39.931710 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:39.931656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-djttm" event={"ID":"34c7625b-b71f-4d8d-a883-c465098dbba7","Type":"ContainerStarted","Data":"3c36271bc43cc5eff12b1157c19768a57224e179d6734b4779da10dcd1fcdbcd"}
Apr 22 17:38:40.936365 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:40.936326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-djttm" event={"ID":"34c7625b-b71f-4d8d-a883-c465098dbba7","Type":"ContainerStarted","Data":"39cce99547b0f29f8a7c454b824f7aa932e200545e044dc09a85d3952301bd7e"}
Apr 22 17:38:40.936365 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:40.936366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-djttm" event={"ID":"34c7625b-b71f-4d8d-a883-c465098dbba7","Type":"ContainerStarted","Data":"38d233acd841ed3b8a8c23565c7af69b67c716ca98128c1cf9e142f99c06d723"}
Apr 22 17:38:40.954065 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:40.954011 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-djttm" podStartSLOduration=252.967719228 podStartE2EDuration="4m13.953995947s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:38:39.150470093 +0000 UTC m=+252.629726754" lastFinishedPulling="2026-04-22 17:38:40.13674681 +0000 UTC m=+253.616003473" observedRunningTime="2026-04-22 17:38:40.952335009 +0000 UTC m=+254.431591691" watchObservedRunningTime="2026-04-22 17:38:40.953995947 +0000 UTC m=+254.433252652"
Apr 22 17:38:45.032711 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:45.032658 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bb4b54cf6-4px5d"
Apr 22 17:38:45.033106 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:45.032742 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bb4b54cf6-4px5d"
Apr 22 17:38:45.037341 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:45.037322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bb4b54cf6-4px5d"
Apr 22 17:38:45.955691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:45.955618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bb4b54cf6-4px5d"
Apr 22 17:38:45.998457 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:38:45.998423 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"]
Apr 22 17:39:11.018904 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.018839 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fc87c8d7b-q9g75" podUID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" containerName="console" containerID="cri-o://06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc" gracePeriod=15
Apr 22 17:39:11.260014 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.259983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fc87c8d7b-q9g75_884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6/console/0.log"
Apr 22 17:39:11.260130 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.260048 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fc87c8d7b-q9g75"
Apr 22 17:39:11.373118 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373026 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txl29\" (UniqueName: \"kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373118 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373072 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373118 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373114 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373132 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373163 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373222 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373239 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config\") pod \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\" (UID: \"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6\") "
Apr 22 17:39:11.373621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373590 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca" (OuterVolumeSpecName: "service-ca") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:11.373621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config" (OuterVolumeSpecName: "console-config") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:11.373754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.373727 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:11.374123 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.374094 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:11.375552 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.375512 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29" (OuterVolumeSpecName: "kube-api-access-txl29") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "kube-api-access-txl29". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:39:11.375621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.375555 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:39:11.375621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.375578 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" (UID: "884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:39:11.473982 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.473942 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txl29\" (UniqueName: \"kubernetes.io/projected/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-kube-api-access-txl29\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.473982 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.473975 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.473982 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.473984 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-trusted-ca-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.473982 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.473993 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-service-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.474242 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.474003 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.474242 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.474012 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-oauth-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:11.474242 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:11.474021 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6-console-oauth-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:39:12.030893 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.030861 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fc87c8d7b-q9g75_884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6/console/0.log"
Apr 22 17:39:12.031346 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.030905 2572 generic.go:358] "Generic (PLEG): container finished" podID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" containerID="06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc" exitCode=2
Apr 22 17:39:12.031346 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.030971 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fc87c8d7b-q9g75"
Apr 22 17:39:12.031346 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.030972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc87c8d7b-q9g75" event={"ID":"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6","Type":"ContainerDied","Data":"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"}
Apr 22 17:39:12.031346 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.031084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc87c8d7b-q9g75" event={"ID":"884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6","Type":"ContainerDied","Data":"39a9fdf14326ee873da5c94f5bdc13d06195b69dc1c6cfe2ef66cb62a2aa526d"}
Apr 22 17:39:12.031346 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.031108 2572 scope.go:117] "RemoveContainer" containerID="06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"
Apr 22 17:39:12.039335 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.039317 2572 scope.go:117] "RemoveContainer" containerID="06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"
Apr 22 17:39:12.039623 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:39:12.039604 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc\": container with ID starting with 06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc not found: ID does not exist" containerID="06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"
Apr 22 17:39:12.039687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.039635 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc"} err="failed to get container status \"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc\": rpc error: code = NotFound desc = could not find container \"06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc\": container with ID starting with 06347481454d373777f98b9bec32d867020cddc7f1c1ae775013d2689f22cacc not found: ID does not exist"
Apr 22 17:39:12.051042 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.051008 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"]
Apr 22 17:39:12.054186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:12.054159 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fc87c8d7b-q9g75"]
Apr 22 17:39:13.098087 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:13.098055 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" path="/var/lib/kubelet/pods/884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6/volumes"
Apr 22 17:39:26.983615 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:26.983581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:39:26.986250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:26.986220 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:39:26.994157 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:26.994134 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 17:39:44.119178 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.119140 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"]
Apr 22 17:39:44.121547 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.119600 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" containerName="console"
Apr 22 17:39:44.121547 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.119616 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" containerName="console"
Apr 22 17:39:44.121547 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.119679 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="884d8fa6-6f42-4cfe-8f7b-9a97fb26dab6" containerName="console"
Apr 22 17:39:44.122341 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.122320 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.132146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.132126 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"]
Apr 22 17:39:44.222799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.222986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.222986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.222986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.222986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.222986 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.222939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.223229 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.223075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wnl\" (UniqueName: \"kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324239 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324239 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324573 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wnl\" (UniqueName: \"kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324573 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.324682 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.324597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.325026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.325000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.325141 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.325090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.325296 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.325275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.325361 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.325344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.326618 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.326589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.326778 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.326760 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.332362 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.332342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wnl\" (UniqueName: \"kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl\") pod \"console-6456cb9b5f-jtnsm\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.433198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.433162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:44.555891 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.555863 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"]
Apr 22 17:39:44.558379 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:39:44.558345 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ab0422_ac23_4482_a3a5_a71e2c0c8011.slice/crio-3c32cf7cd1ca5ec3fea2d44eeb53056d97049e719b7f63fc630937907664b67a WatchSource:0}: Error finding container 3c32cf7cd1ca5ec3fea2d44eeb53056d97049e719b7f63fc630937907664b67a: Status 404 returned error can't find the container with id 3c32cf7cd1ca5ec3fea2d44eeb53056d97049e719b7f63fc630937907664b67a
Apr 22 17:39:44.560287 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:44.560265 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:39:45.129582 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:45.129546 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6456cb9b5f-jtnsm" event={"ID":"a6ab0422-ac23-4482-a3a5-a71e2c0c8011","Type":"ContainerStarted","Data":"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"}
Apr 22 17:39:45.129582 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:45.129584 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6456cb9b5f-jtnsm" event={"ID":"a6ab0422-ac23-4482-a3a5-a71e2c0c8011","Type":"ContainerStarted","Data":"3c32cf7cd1ca5ec3fea2d44eeb53056d97049e719b7f63fc630937907664b67a"}
Apr 22 17:39:45.147791 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:45.147731 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6456cb9b5f-jtnsm" podStartSLOduration=1.147712276 podStartE2EDuration="1.147712276s" podCreationTimestamp="2026-04-22 17:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:39:45.146088179 +0000 UTC m=+318.625344863" watchObservedRunningTime="2026-04-22 17:39:45.147712276 +0000 UTC m=+318.626968950"
Apr 22 17:39:54.433758 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:54.433728 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:54.434134 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:54.433770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:54.438370 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:54.438348 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:55.163546 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:55.163518 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:39:55.208894 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:39:55.208864 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"]
Apr 22 17:40:20.229056 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.228947 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bb4b54cf6-4px5d" podUID="d97a52af-4f2e-40cf-9b6c-6363047ec549" containerName="console" containerID="cri-o://9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d" gracePeriod=15
Apr 22 17:40:20.466026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.466001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb4b54cf6-4px5d_d97a52af-4f2e-40cf-9b6c-6363047ec549/console/0.log"
Apr 22 17:40:20.466165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.466066 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb4b54cf6-4px5d"
Apr 22 17:40:20.531387 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531295 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531387 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpwrw\" (UniqueName: \"kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531387 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531359 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531387 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531382 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531655 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531593 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531731 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531681 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531794 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531727 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config\") pod \"d97a52af-4f2e-40cf-9b6c-6363047ec549\" (UID: \"d97a52af-4f2e-40cf-9b6c-6363047ec549\") "
Apr 22 17:40:20.531851 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531828 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca" (OuterVolumeSpecName: "service-ca") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:40:20.531927 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.531826 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:40:20.532121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.532099 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-oauth-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:40:20.532211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.532114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:40:20.532211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.532127 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-service-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:40:20.532211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.532153 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config" (OuterVolumeSpecName: "console-config") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:40:20.533655 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.533635 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:40:20.534038 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.534014 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:40:20.534114 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.534017 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw" (OuterVolumeSpecName: "kube-api-access-kpwrw") pod "d97a52af-4f2e-40cf-9b6c-6363047ec549" (UID: "d97a52af-4f2e-40cf-9b6c-6363047ec549"). InnerVolumeSpecName "kube-api-access-kpwrw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:40:20.633275 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.633224 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-trusted-ca-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:40:20.633275 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.633269 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:40:20.633275 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.633281 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpwrw\" (UniqueName: \"kubernetes.io/projected/d97a52af-4f2e-40cf-9b6c-6363047ec549-kube-api-access-kpwrw\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:40:20.633275 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.633291 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:40:20.633554 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:20.633300 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97a52af-4f2e-40cf-9b6c-6363047ec549-console-oauth-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:40:21.239091 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb4b54cf6-4px5d_d97a52af-4f2e-40cf-9b6c-6363047ec549/console/0.log" Apr 22 17:40:21.239545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239102 2572 generic.go:358] "Generic (PLEG): container finished" podID="d97a52af-4f2e-40cf-9b6c-6363047ec549" containerID="9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d" exitCode=2 Apr 22 17:40:21.239545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb4b54cf6-4px5d" event={"ID":"d97a52af-4f2e-40cf-9b6c-6363047ec549","Type":"ContainerDied","Data":"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d"} Apr 22 17:40:21.239545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239174 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb4b54cf6-4px5d" event={"ID":"d97a52af-4f2e-40cf-9b6c-6363047ec549","Type":"ContainerDied","Data":"27ce9443a3dcee56ee81910ee040ba5fb3b44889daf4a0563637d8e1dada00b5"} Apr 22 17:40:21.239545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239196 2572 scope.go:117] "RemoveContainer" 
containerID="9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d" Apr 22 17:40:21.239545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.239200 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb4b54cf6-4px5d" Apr 22 17:40:21.247800 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.247780 2572 scope.go:117] "RemoveContainer" containerID="9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d" Apr 22 17:40:21.248058 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:40:21.248036 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d\": container with ID starting with 9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d not found: ID does not exist" containerID="9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d" Apr 22 17:40:21.248126 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.248069 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d"} err="failed to get container status \"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d\": rpc error: code = NotFound desc = could not find container \"9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d\": container with ID starting with 9e0bf24ff1c1793ad7371c25b806d7cc970e69c8eba12c1f8d85c610991d6f0d not found: ID does not exist" Apr 22 17:40:21.257934 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.257908 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"] Apr 22 17:40:21.261453 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:21.261432 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bb4b54cf6-4px5d"] Apr 22 17:40:23.098325 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:40:23.098286 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97a52af-4f2e-40cf-9b6c-6363047ec549" path="/var/lib/kubelet/pods/d97a52af-4f2e-40cf-9b6c-6363047ec549/volumes" Apr 22 17:40:28.270200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.270166 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tsg64"] Apr 22 17:40:28.270591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.270477 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97a52af-4f2e-40cf-9b6c-6363047ec549" containerName="console" Apr 22 17:40:28.270591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.270486 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97a52af-4f2e-40cf-9b6c-6363047ec549" containerName="console" Apr 22 17:40:28.270591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.270535 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97a52af-4f2e-40cf-9b6c-6363047ec549" containerName="console" Apr 22 17:40:28.274546 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.274529 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.276782 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.276758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:40:28.280083 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.280062 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tsg64"] Apr 22 17:40:28.292958 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.292923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-dbus\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.293062 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.292966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-original-pull-secret\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.293062 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.293040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-kubelet-config\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.394333 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.394297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-dbus\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.394479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.394341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-original-pull-secret\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.394479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.394392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-kubelet-config\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.394591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.394485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-kubelet-config\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.394591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.394497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-dbus\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.399014 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.398986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fb37d3e6-7be5-4eaa-8699-e8a8a641b235-original-pull-secret\") pod \"global-pull-secret-syncer-tsg64\" (UID: \"fb37d3e6-7be5-4eaa-8699-e8a8a641b235\") " pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.585188 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.585101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tsg64" Apr 22 17:40:28.706103 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:28.706059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tsg64"] Apr 22 17:40:28.708593 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:40:28.708568 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb37d3e6_7be5_4eaa_8699_e8a8a641b235.slice/crio-ecbfae7280e434ce94f86edde164ff7c579f44dcc3dc0b41f16f157955482f16 WatchSource:0}: Error finding container ecbfae7280e434ce94f86edde164ff7c579f44dcc3dc0b41f16f157955482f16: Status 404 returned error can't find the container with id ecbfae7280e434ce94f86edde164ff7c579f44dcc3dc0b41f16f157955482f16 Apr 22 17:40:29.268607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:29.268564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tsg64" event={"ID":"fb37d3e6-7be5-4eaa-8699-e8a8a641b235","Type":"ContainerStarted","Data":"ecbfae7280e434ce94f86edde164ff7c579f44dcc3dc0b41f16f157955482f16"} Apr 22 17:40:33.282245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:33.282211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tsg64" event={"ID":"fb37d3e6-7be5-4eaa-8699-e8a8a641b235","Type":"ContainerStarted","Data":"3d4b02d0b99ee330200f52e6dca619f142a0ff29d92f3e48b18da412d408a38e"} Apr 22 17:40:33.297011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:33.296966 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tsg64" podStartSLOduration=1.4240173870000001 podStartE2EDuration="5.296952675s" podCreationTimestamp="2026-04-22 17:40:28 +0000 UTC" firstStartedPulling="2026-04-22 17:40:28.710338578 +0000 UTC m=+362.189595238" lastFinishedPulling="2026-04-22 17:40:32.583273866 +0000 UTC m=+366.062530526" observedRunningTime="2026-04-22 17:40:33.295790842 +0000 UTC m=+366.775047529" watchObservedRunningTime="2026-04-22 17:40:33.296952675 +0000 UTC m=+366.776209357" Apr 22 17:40:49.203418 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.203380 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst"] Apr 22 17:40:49.207432 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.207416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.209645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.209603 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:40:49.209645 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.209641 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:40:49.210429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.210416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\"" Apr 22 17:40:49.214989 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.214957 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst"] Apr 22 17:40:49.285952 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.285909 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.286153 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.286010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.286153 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.286073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lqk\" (UniqueName: \"kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.387022 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.386985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.387196 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.387044 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.387196 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.387078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lqk\" (UniqueName: \"kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.387415 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.387394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.387454 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.387412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.395736 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.395684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lqk\" 
(UniqueName: \"kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.517053 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.516964 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:40:49.632250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:49.632221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst"] Apr 22 17:40:49.634741 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:40:49.634713 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5f6125_5fae_454d_94f8_0a79e7181e14.slice/crio-030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07 WatchSource:0}: Error finding container 030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07: Status 404 returned error can't find the container with id 030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07 Apr 22 17:40:50.336058 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:50.336017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" event={"ID":"bc5f6125-5fae-454d-94f8-0a79e7181e14","Type":"ContainerStarted","Data":"030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07"} Apr 22 17:40:55.352651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:55.352550 2572 generic.go:358] "Generic (PLEG): container finished" podID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerID="f6dbd8a8cde513a51482f7b06121f4fc7217bc3741f82a93e1228c01d08f49ce" 
exitCode=0 Apr 22 17:40:55.352651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:55.352639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" event={"ID":"bc5f6125-5fae-454d-94f8-0a79e7181e14","Type":"ContainerDied","Data":"f6dbd8a8cde513a51482f7b06121f4fc7217bc3741f82a93e1228c01d08f49ce"} Apr 22 17:40:57.359788 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:57.359753 2572 generic.go:358] "Generic (PLEG): container finished" podID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerID="59c570587c1fb92c379924d4776a57659217acd5dee4abbf905964d68d5ba792" exitCode=0 Apr 22 17:40:57.360175 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:40:57.359823 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" event={"ID":"bc5f6125-5fae-454d-94f8-0a79e7181e14","Type":"ContainerDied","Data":"59c570587c1fb92c379924d4776a57659217acd5dee4abbf905964d68d5ba792"} Apr 22 17:41:04.384013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:04.383985 2572 generic.go:358] "Generic (PLEG): container finished" podID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerID="bb4109decc4549c06d74fc59797ac5158c63f02ae00c3dd29b7b773c2a7ea348" exitCode=0 Apr 22 17:41:04.384271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:04.384026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" event={"ID":"bc5f6125-5fae-454d-94f8-0a79e7181e14","Type":"ContainerDied","Data":"bb4109decc4549c06d74fc59797ac5158c63f02ae00c3dd29b7b773c2a7ea348"} Apr 22 17:41:05.505147 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.505124 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" Apr 22 17:41:05.635229 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.635189 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7lqk\" (UniqueName: \"kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk\") pod \"bc5f6125-5fae-454d-94f8-0a79e7181e14\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " Apr 22 17:41:05.635405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.635254 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle\") pod \"bc5f6125-5fae-454d-94f8-0a79e7181e14\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " Apr 22 17:41:05.635405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.635344 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util\") pod \"bc5f6125-5fae-454d-94f8-0a79e7181e14\" (UID: \"bc5f6125-5fae-454d-94f8-0a79e7181e14\") " Apr 22 17:41:05.635853 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.635830 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle" (OuterVolumeSpecName: "bundle") pod "bc5f6125-5fae-454d-94f8-0a79e7181e14" (UID: "bc5f6125-5fae-454d-94f8-0a79e7181e14"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:41:05.637445 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.637370 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk" (OuterVolumeSpecName: "kube-api-access-l7lqk") pod "bc5f6125-5fae-454d-94f8-0a79e7181e14" (UID: "bc5f6125-5fae-454d-94f8-0a79e7181e14"). InnerVolumeSpecName "kube-api-access-l7lqk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:41:05.640766 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.640741 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util" (OuterVolumeSpecName: "util") pod "bc5f6125-5fae-454d-94f8-0a79e7181e14" (UID: "bc5f6125-5fae-454d-94f8-0a79e7181e14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:41:05.736210 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.736179 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:41:05.736210 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.736206 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc5f6125-5fae-454d-94f8-0a79e7181e14-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:41:05.736210 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:05.736215 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7lqk\" (UniqueName: \"kubernetes.io/projected/bc5f6125-5fae-454d-94f8-0a79e7181e14-kube-api-access-l7lqk\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:41:06.391381 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:06.391354 2572 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst"
Apr 22 17:41:06.391535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:06.391354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29chpfst" event={"ID":"bc5f6125-5fae-454d-94f8-0a79e7181e14","Type":"ContainerDied","Data":"030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07"}
Apr 22 17:41:06.391535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:06.391467 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030223e8c60d0550158080d0695659964b4004d53c61bbf52ca225230dcc4b07"
Apr 22 17:41:10.840980 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.840944 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"]
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841258 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="util"
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841268 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="util"
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841279 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="extract"
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841285 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="extract"
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841294 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="pull"
Apr 22 17:41:10.841350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841299 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="pull"
Apr 22 17:41:10.841537 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.841369 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5f6125-5fae-454d-94f8-0a79e7181e14" containerName="extract"
Apr 22 17:41:10.889355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.889320 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"]
Apr 22 17:41:10.889505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.889399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:10.891947 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.891911 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 17:41:10.891947 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.891926 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-4d2q9\""
Apr 22 17:41:10.892138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.891947 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 17:41:10.892138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.891953 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 17:41:10.979972 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.979929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbfl\" (UniqueName: \"kubernetes.io/projected/2acc8124-5bcd-436a-b870-98db4f897cf4-kube-api-access-hmbfl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:10.980158 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:10.979986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2acc8124-5bcd-436a-b870-98db4f897cf4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.080756 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.080723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbfl\" (UniqueName: \"kubernetes.io/projected/2acc8124-5bcd-436a-b870-98db4f897cf4-kube-api-access-hmbfl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.080917 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.080779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2acc8124-5bcd-436a-b870-98db4f897cf4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.083176 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.083152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/2acc8124-5bcd-436a-b870-98db4f897cf4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.090662 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.090638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbfl\" (UniqueName: \"kubernetes.io/projected/2acc8124-5bcd-436a-b870-98db4f897cf4-kube-api-access-hmbfl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2\" (UID: \"2acc8124-5bcd-436a-b870-98db4f897cf4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.198888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.198862 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:11.323476 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.323358 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"]
Apr 22 17:41:11.326198 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:41:11.326167 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2acc8124_5bcd_436a_b870_98db4f897cf4.slice/crio-d383a02a7e79cce006d16edab5312dfc65e5be2cc99514f87f6d82a6bb1b1e80 WatchSource:0}: Error finding container d383a02a7e79cce006d16edab5312dfc65e5be2cc99514f87f6d82a6bb1b1e80: Status 404 returned error can't find the container with id d383a02a7e79cce006d16edab5312dfc65e5be2cc99514f87f6d82a6bb1b1e80
Apr 22 17:41:11.405815 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:11.405786 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2" event={"ID":"2acc8124-5bcd-436a-b870-98db4f897cf4","Type":"ContainerStarted","Data":"d383a02a7e79cce006d16edab5312dfc65e5be2cc99514f87f6d82a6bb1b1e80"}
Apr 22 17:41:18.368184 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.368149 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2f6md"]
Apr 22 17:41:18.371686 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.371667 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.373775 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.373752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-75qrh\""
Apr 22 17:41:18.373857 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.373758 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 17:41:18.373857 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.373757 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 17:41:18.381008 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.380974 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2f6md"]
Apr 22 17:41:18.432188 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.432158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2" event={"ID":"2acc8124-5bcd-436a-b870-98db4f897cf4","Type":"ContainerStarted","Data":"77135b92fb736f8d32b224251dccf1eb5beede7723287da6cd9dc37ca5aab46e"}
Apr 22 17:41:18.432371 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.432285 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:18.445332 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.445308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.445463 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.445358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsnrt\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-kube-api-access-xsnrt\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.445463 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.445418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/393f3374-2cc7-46aa-bec5-6e59161305ef-cabundle0\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.452753 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.452688 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2" podStartSLOduration=1.9514631119999999 podStartE2EDuration="8.452672846s" podCreationTimestamp="2026-04-22 17:41:10 +0000 UTC" firstStartedPulling="2026-04-22 17:41:11.328008595 +0000 UTC m=+404.807265255" lastFinishedPulling="2026-04-22 17:41:17.829218326 +0000 UTC m=+411.308474989" observedRunningTime="2026-04-22 17:41:18.451051675 +0000 UTC m=+411.930308357" watchObservedRunningTime="2026-04-22 17:41:18.452672846 +0000 UTC m=+411.931929529"
Apr 22 17:41:18.546266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.546231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsnrt\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-kube-api-access-xsnrt\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.546438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.546299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/393f3374-2cc7-46aa-bec5-6e59161305ef-cabundle0\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.546438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.546346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.546558 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.546475 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:41:18.546558 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.546496 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:41:18.546558 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.546508 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2f6md: references non-existent secret key: ca.crt
Apr 22 17:41:18.546673 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.546580 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates podName:393f3374-2cc7-46aa-bec5-6e59161305ef nodeName:}" failed. No retries permitted until 2026-04-22 17:41:19.046559908 +0000 UTC m=+412.525816589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates") pod "keda-operator-ffbb595cb-2f6md" (UID: "393f3374-2cc7-46aa-bec5-6e59161305ef") : references non-existent secret key: ca.crt
Apr 22 17:41:18.546988 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.546968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/393f3374-2cc7-46aa-bec5-6e59161305ef-cabundle0\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.554290 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.554263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsnrt\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-kube-api-access-xsnrt\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:18.647684 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.647643 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"]
Apr 22 17:41:18.651343 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.651321 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.653296 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.653268 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 17:41:18.660879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.660848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"]
Apr 22 17:41:18.748096 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.748067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b9e14249-be02-4cb5-808f-598266c9be5d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.748096 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.748099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pj9\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-kube-api-access-m5pj9\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.748358 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.748127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.849731 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.849680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b9e14249-be02-4cb5-808f-598266c9be5d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.849922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.849742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pj9\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-kube-api-access-m5pj9\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.849922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.849786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.850076 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.849947 2572 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:41:18.850076 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.849965 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:41:18.850076 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.849984 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn: references non-existent secret key: tls.crt
Apr 22 17:41:18.850076 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:18.850037 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates podName:b9e14249-be02-4cb5-808f-598266c9be5d nodeName:}" failed. No retries permitted until 2026-04-22 17:41:19.350021227 +0000 UTC m=+412.829277887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates") pod "keda-metrics-apiserver-7c9f485588-mmpwn" (UID: "b9e14249-be02-4cb5-808f-598266c9be5d") : references non-existent secret key: tls.crt
Apr 22 17:41:18.850076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.850048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b9e14249-be02-4cb5-808f-598266c9be5d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.862562 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.862534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pj9\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-kube-api-access-m5pj9\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:18.953842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.953765 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-96zph"]
Apr 22 17:41:18.957409 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.957394 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:18.959690 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.959669 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 17:41:18.967225 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:18.967199 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-96zph"]
Apr 22 17:41:19.052313 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.052279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-certificates\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.052476 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.052335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:19.052476 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.052377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjmz\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-kube-api-access-6pjmz\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.052550 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.052494 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:41:19.052550 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.052510 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:41:19.052550 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.052518 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2f6md: references non-existent secret key: ca.crt
Apr 22 17:41:19.052645 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.052566 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates podName:393f3374-2cc7-46aa-bec5-6e59161305ef nodeName:}" failed. No retries permitted until 2026-04-22 17:41:20.05254921 +0000 UTC m=+413.531805878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates") pod "keda-operator-ffbb595cb-2f6md" (UID: "393f3374-2cc7-46aa-bec5-6e59161305ef") : references non-existent secret key: ca.crt
Apr 22 17:41:19.153168 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.153139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjmz\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-kube-api-access-6pjmz\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.153331 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.153193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-certificates\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.155604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.155579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-certificates\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.160440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.160413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjmz\" (UniqueName: \"kubernetes.io/projected/68ad63d2-3ef4-4e65-8c39-e860f5b921a3-kube-api-access-6pjmz\") pod \"keda-admission-cf49989db-96zph\" (UID: \"68ad63d2-3ef4-4e65-8c39-e860f5b921a3\") " pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.268535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.268466 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:19.355373 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.355333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:19.355602 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.355583 2572 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:41:19.355602 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.355602 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:41:19.355762 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.355620 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn: references non-existent secret key: tls.crt
Apr 22 17:41:19.355762 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:19.355674 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates podName:b9e14249-be02-4cb5-808f-598266c9be5d nodeName:}" failed. No retries permitted until 2026-04-22 17:41:20.355659275 +0000 UTC m=+413.834915936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates") pod "keda-metrics-apiserver-7c9f485588-mmpwn" (UID: "b9e14249-be02-4cb5-808f-598266c9be5d") : references non-existent secret key: tls.crt
Apr 22 17:41:19.391992 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.391968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-96zph"]
Apr 22 17:41:19.394789 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:41:19.394758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ad63d2_3ef4_4e65_8c39_e860f5b921a3.slice/crio-ed883236c3d5f9fef495d8025129bb48384eb35873f8f9de64f1525651af06dc WatchSource:0}: Error finding container ed883236c3d5f9fef495d8025129bb48384eb35873f8f9de64f1525651af06dc: Status 404 returned error can't find the container with id ed883236c3d5f9fef495d8025129bb48384eb35873f8f9de64f1525651af06dc
Apr 22 17:41:19.439029 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:19.438998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-96zph" event={"ID":"68ad63d2-3ef4-4e65-8c39-e860f5b921a3","Type":"ContainerStarted","Data":"ed883236c3d5f9fef495d8025129bb48384eb35873f8f9de64f1525651af06dc"}
Apr 22 17:41:20.062554 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:20.062521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:20.062756 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.062690 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:41:20.062756 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.062724 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:41:20.062756 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.062736 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2f6md: references non-existent secret key: ca.crt
Apr 22 17:41:20.062886 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.062787 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates podName:393f3374-2cc7-46aa-bec5-6e59161305ef nodeName:}" failed. No retries permitted until 2026-04-22 17:41:22.062771961 +0000 UTC m=+415.542028620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates") pod "keda-operator-ffbb595cb-2f6md" (UID: "393f3374-2cc7-46aa-bec5-6e59161305ef") : references non-existent secret key: ca.crt
Apr 22 17:41:20.364316 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:20.364223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:20.364534 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.364380 2572 secret.go:281] references non-existent secret key: tls.crt
Apr 22 17:41:20.364534 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.364403 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 17:41:20.364534 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.364425 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn: references non-existent secret key: tls.crt
Apr 22 17:41:20.364534 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:41:20.364516 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates podName:b9e14249-be02-4cb5-808f-598266c9be5d nodeName:}" failed. No retries permitted until 2026-04-22 17:41:22.364496628 +0000 UTC m=+415.843753290 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates") pod "keda-metrics-apiserver-7c9f485588-mmpwn" (UID: "b9e14249-be02-4cb5-808f-598266c9be5d") : references non-existent secret key: tls.crt
Apr 22 17:41:21.446738 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:21.446683 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-96zph" event={"ID":"68ad63d2-3ef4-4e65-8c39-e860f5b921a3","Type":"ContainerStarted","Data":"6d0e33a2dd3df229dd08b5b6fc5d4f7299024caeb409298bdffdd5d1a783e197"}
Apr 22 17:41:21.447209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:21.446815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:21.465490 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:21.465443 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-96zph" podStartSLOduration=1.902687743 podStartE2EDuration="3.465431798s" podCreationTimestamp="2026-04-22 17:41:18 +0000 UTC" firstStartedPulling="2026-04-22 17:41:19.396395175 +0000 UTC m=+412.875651835" lastFinishedPulling="2026-04-22 17:41:20.959139231 +0000 UTC m=+414.438395890" observedRunningTime="2026-04-22 17:41:21.463831405 +0000 UTC m=+414.943088086" watchObservedRunningTime="2026-04-22 17:41:21.465431798 +0000 UTC m=+414.944688480"
Apr 22 17:41:22.081207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.081167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:22.083600 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.083582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/393f3374-2cc7-46aa-bec5-6e59161305ef-certificates\") pod \"keda-operator-ffbb595cb-2f6md\" (UID: \"393f3374-2cc7-46aa-bec5-6e59161305ef\") " pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:22.282359 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.282323 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:22.383304 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.383270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:22.385678 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.385654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9e14249-be02-4cb5-808f-598266c9be5d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-mmpwn\" (UID: \"b9e14249-be02-4cb5-808f-598266c9be5d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:22.399231 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.399211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2f6md"]
Apr 22 17:41:22.400896 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:41:22.400871 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod393f3374_2cc7_46aa_bec5_6e59161305ef.slice/crio-c3385e98685b0a6229db51c3f490cef8260ef536ac12997b06ccf6bc33a1ca90 WatchSource:0}: Error finding container c3385e98685b0a6229db51c3f490cef8260ef536ac12997b06ccf6bc33a1ca90: Status 404 returned error can't find the container with id c3385e98685b0a6229db51c3f490cef8260ef536ac12997b06ccf6bc33a1ca90
Apr 22 17:41:22.451230 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.451202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2f6md" event={"ID":"393f3374-2cc7-46aa-bec5-6e59161305ef","Type":"ContainerStarted","Data":"c3385e98685b0a6229db51c3f490cef8260ef536ac12997b06ccf6bc33a1ca90"}
Apr 22 17:41:22.564395 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.564371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:22.687055 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:22.687019 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"]
Apr 22 17:41:22.689974 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:41:22.689939 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e14249_be02_4cb5_808f_598266c9be5d.slice/crio-46c37095ddba0fb01acdf70e8a68da3b10ee1549327d899dd6e88a0ee726d7ca WatchSource:0}: Error finding container 46c37095ddba0fb01acdf70e8a68da3b10ee1549327d899dd6e88a0ee726d7ca: Status 404 returned error can't find the container with id 46c37095ddba0fb01acdf70e8a68da3b10ee1549327d899dd6e88a0ee726d7ca
Apr 22 17:41:23.455283 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:23.455247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn" event={"ID":"b9e14249-be02-4cb5-808f-598266c9be5d","Type":"ContainerStarted","Data":"46c37095ddba0fb01acdf70e8a68da3b10ee1549327d899dd6e88a0ee726d7ca"}
Apr 22 17:41:26.468851 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.468804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2f6md" event={"ID":"393f3374-2cc7-46aa-bec5-6e59161305ef","Type":"ContainerStarted","Data":"f263483c359a4c8453464d56c873c47a198776f49d3b259e673a5c455c985547"}
Apr 22 17:41:26.469305 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.468928 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:41:26.470172 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.470152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn" event={"ID":"b9e14249-be02-4cb5-808f-598266c9be5d","Type":"ContainerStarted","Data":"88be292f496e12f72314ede348233334912ce010815fc2e2ca1982b009bf32e2"}
Apr 22 17:41:26.470294 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.470283 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:26.485175 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.485079 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2f6md" podStartSLOduration=4.6637093499999995 podStartE2EDuration="8.485067233s" podCreationTimestamp="2026-04-22 17:41:18 +0000 UTC" firstStartedPulling="2026-04-22 17:41:22.402244931 +0000 UTC m=+415.881501590" lastFinishedPulling="2026-04-22 17:41:26.223602809 +0000 UTC m=+419.702859473" observedRunningTime="2026-04-22 17:41:26.484305468 +0000 UTC m=+419.963562149" watchObservedRunningTime="2026-04-22 17:41:26.485067233 +0000 UTC m=+419.964323916"
Apr 22 17:41:26.507481 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:26.502612 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn" podStartSLOduration=4.972456483 podStartE2EDuration="8.502597416s" podCreationTimestamp="2026-04-22 17:41:18 +0000 UTC" firstStartedPulling="2026-04-22 17:41:22.693922085 +0000 UTC m=+416.173178745" lastFinishedPulling="2026-04-22 17:41:26.224063018 +0000 UTC m=+419.703319678" observedRunningTime="2026-04-22 17:41:26.501889905 +0000 UTC m=+419.981146589" watchObservedRunningTime="2026-04-22 17:41:26.502597416 +0000 UTC m=+419.981854099"
Apr 22 17:41:37.478580 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:37.478544 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-mmpwn"
Apr 22 17:41:39.441116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:39.441087 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n4xx2"
Apr 22 17:41:42.453224 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:42.453195 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-96zph"
Apr 22 17:41:47.476656 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:41:47.476585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2f6md"
Apr 22 17:42:09.131076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.131043 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv"]
Apr 22 17:42:09.142223 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.142199 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv"]
Apr 22 17:42:09.142385 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.142302 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.144469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.144448 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:42:09.144592 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.144453 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:42:09.145283 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.145265 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\"" Apr 22 17:42:09.259839 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.259810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.259999 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.259854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8b4\" (UniqueName: \"kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.259999 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.259935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.361055 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.361018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.361238 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.361094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.361238 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.361115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8b4\" (UniqueName: \"kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.361399 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.361378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util\") pod 
\"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.361466 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.361446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.368893 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.368862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8b4\" (UniqueName: \"kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.451814 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.451790 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:09.566491 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.566467 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv"] Apr 22 17:42:09.568611 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:42:09.568587 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec59044e_b266_452e_9b38_c5d7c7d234db.slice/crio-2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be WatchSource:0}: Error finding container 2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be: Status 404 returned error can't find the container with id 2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be Apr 22 17:42:09.613125 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:09.613090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" event={"ID":"ec59044e-b266-452e-9b38-c5d7c7d234db","Type":"ContainerStarted","Data":"2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be"} Apr 22 17:42:10.617258 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:10.617222 2572 generic.go:358] "Generic (PLEG): container finished" podID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerID="ec615efdd7d34b080f01a95614c4f21c695f14a635649b3efef551df2081ac40" exitCode=0 Apr 22 17:42:10.617653 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:10.617313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" event={"ID":"ec59044e-b266-452e-9b38-c5d7c7d234db","Type":"ContainerDied","Data":"ec615efdd7d34b080f01a95614c4f21c695f14a635649b3efef551df2081ac40"} Apr 22 17:42:14.631796 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:42:14.631759 2572 generic.go:358] "Generic (PLEG): container finished" podID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerID="1d7cf7ddab8578499f6aa847f05c4f539da21a69aac776d127b0dc183dbe8cb5" exitCode=0 Apr 22 17:42:14.632182 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:14.631822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" event={"ID":"ec59044e-b266-452e-9b38-c5d7c7d234db","Type":"ContainerDied","Data":"1d7cf7ddab8578499f6aa847f05c4f539da21a69aac776d127b0dc183dbe8cb5"} Apr 22 17:42:15.637152 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:15.637116 2572 generic.go:358] "Generic (PLEG): container finished" podID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerID="1a5fa702153ddb08032de47094db8edd0e60da0ddd7d5eebcc78abe273bd5c45" exitCode=0 Apr 22 17:42:15.637529 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:15.637159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" event={"ID":"ec59044e-b266-452e-9b38-c5d7c7d234db","Type":"ContainerDied","Data":"1a5fa702153ddb08032de47094db8edd0e60da0ddd7d5eebcc78abe273bd5c45"} Apr 22 17:42:16.766101 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.766074 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:16.826474 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.826440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle\") pod \"ec59044e-b266-452e-9b38-c5d7c7d234db\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " Apr 22 17:42:16.826649 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.826532 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util\") pod \"ec59044e-b266-452e-9b38-c5d7c7d234db\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " Apr 22 17:42:16.826649 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.826565 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf8b4\" (UniqueName: \"kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4\") pod \"ec59044e-b266-452e-9b38-c5d7c7d234db\" (UID: \"ec59044e-b266-452e-9b38-c5d7c7d234db\") " Apr 22 17:42:16.827230 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.827204 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle" (OuterVolumeSpecName: "bundle") pod "ec59044e-b266-452e-9b38-c5d7c7d234db" (UID: "ec59044e-b266-452e-9b38-c5d7c7d234db"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:42:16.828625 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.828597 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4" (OuterVolumeSpecName: "kube-api-access-nf8b4") pod "ec59044e-b266-452e-9b38-c5d7c7d234db" (UID: "ec59044e-b266-452e-9b38-c5d7c7d234db"). InnerVolumeSpecName "kube-api-access-nf8b4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:42:16.832413 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.832383 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util" (OuterVolumeSpecName: "util") pod "ec59044e-b266-452e-9b38-c5d7c7d234db" (UID: "ec59044e-b266-452e-9b38-c5d7c7d234db"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:42:16.927682 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.927594 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:16.927682 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.927626 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nf8b4\" (UniqueName: \"kubernetes.io/projected/ec59044e-b266-452e-9b38-c5d7c7d234db-kube-api-access-nf8b4\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:16.927682 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:16.927638 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec59044e-b266-452e-9b38-c5d7c7d234db-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:17.644858 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:17.644813 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" event={"ID":"ec59044e-b266-452e-9b38-c5d7c7d234db","Type":"ContainerDied","Data":"2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be"} Apr 22 17:42:17.644858 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:17.644858 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6bb2200d1268119f07be7b260be0a7eb87baaab93d519b3b8bf3711f56e3be" Apr 22 17:42:17.644858 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:17.644863 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dlhlsv" Apr 22 17:42:31.071247 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071202 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq"] Apr 22 17:42:31.071750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071731 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="extract" Apr 22 17:42:31.071842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071753 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="extract" Apr 22 17:42:31.071842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071769 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="util" Apr 22 17:42:31.071842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071778 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="util" Apr 22 17:42:31.071842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071802 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="pull" Apr 22 17:42:31.071842 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071810 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="pull" Apr 22 17:42:31.072073 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.071912 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec59044e-b266-452e-9b38-c5d7c7d234db" containerName="extract" Apr 22 17:42:31.074269 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.074247 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.076855 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.076830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\"" Apr 22 17:42:31.076970 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.076876 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:42:31.076970 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.076890 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:42:31.084996 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.084972 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq"] Apr 22 17:42:31.253806 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.253757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r5m\" (UniqueName: \"kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: 
\"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.253806 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.253814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.254021 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.253872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.355199 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.355088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r5m\" (UniqueName: \"kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.355199 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.355152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.355199 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.355193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.355600 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.355547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.355661 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.355616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.362835 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.362817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r5m\" (UniqueName: \"kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 
17:42:31.383377 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.383351 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:31.508318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.508278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq"] Apr 22 17:42:31.511299 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:42:31.511270 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5b8bc9_fe6c_4edf_990b_8025e33af3e6.slice/crio-6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33 WatchSource:0}: Error finding container 6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33: Status 404 returned error can't find the container with id 6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33 Apr 22 17:42:31.694080 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.694049 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerID="98ff02302c0150f8eb7ad72086f2ff3b0c589a6462c72eae4691a049ca2c51d3" exitCode=0 Apr 22 17:42:31.694266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.694126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" event={"ID":"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6","Type":"ContainerDied","Data":"98ff02302c0150f8eb7ad72086f2ff3b0c589a6462c72eae4691a049ca2c51d3"} Apr 22 17:42:31.694266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:31.694149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" 
event={"ID":"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6","Type":"ContainerStarted","Data":"6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33"} Apr 22 17:42:34.705674 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:34.705642 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerID="9f7de8ceb486c75a3e22b72c66fd1546b974ab9f4799cf656cfce7a8df5c8cbd" exitCode=0 Apr 22 17:42:34.706099 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:34.705717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" event={"ID":"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6","Type":"ContainerDied","Data":"9f7de8ceb486c75a3e22b72c66fd1546b974ab9f4799cf656cfce7a8df5c8cbd"} Apr 22 17:42:35.710479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:35.710442 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerID="aeeb962be69f32d0f1cc0ced4a069368854da213627863fa777f24ae4e9ef510" exitCode=0 Apr 22 17:42:35.710952 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:35.710487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" event={"ID":"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6","Type":"ContainerDied","Data":"aeeb962be69f32d0f1cc0ced4a069368854da213627863fa777f24ae4e9ef510"} Apr 22 17:42:36.831844 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:36.831820 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:37.006403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.006308 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util\") pod \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " Apr 22 17:42:37.006403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.006374 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7r5m\" (UniqueName: \"kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m\") pod \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " Apr 22 17:42:37.006633 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.006424 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle\") pod \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\" (UID: \"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6\") " Apr 22 17:42:37.006867 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.006843 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle" (OuterVolumeSpecName: "bundle") pod "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" (UID: "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:42:37.008482 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.008440 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m" (OuterVolumeSpecName: "kube-api-access-x7r5m") pod "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" (UID: "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6"). InnerVolumeSpecName "kube-api-access-x7r5m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:42:37.011298 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.011272 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util" (OuterVolumeSpecName: "util") pod "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" (UID: "fa5b8bc9-fe6c-4edf-990b-8025e33af3e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:42:37.107358 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.107335 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:37.107358 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.107356 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:37.107500 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.107365 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7r5m\" (UniqueName: \"kubernetes.io/projected/fa5b8bc9-fe6c-4edf-990b-8025e33af3e6-kube-api-access-x7r5m\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:42:37.721263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.721226 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" event={"ID":"fa5b8bc9-fe6c-4edf-990b-8025e33af3e6","Type":"ContainerDied","Data":"6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33"} Apr 22 17:42:37.721263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.721268 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6afe9389d87d4c83460d413d3a743cbafbe7acd4ef0bac6d6800edd814978b33" Apr 22 17:42:37.721469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:37.721241 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7r4rq" Apr 22 17:42:43.874232 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874194 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78"] Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874573 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="util" Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874585 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="util" Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874594 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="extract" Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874600 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="extract" Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874618 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="pull" Apr 22 17:42:43.874634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874624 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="pull" Apr 22 17:42:43.874848 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.874683 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa5b8bc9-fe6c-4edf-990b-8025e33af3e6" containerName="extract" Apr 22 17:42:43.880505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.880488 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:43.883026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.883007 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 17:42:43.883129 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.883021 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:42:43.883750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.883733 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-xv92k\"" Apr 22 17:42:43.888800 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.888779 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78"] Apr 22 17:42:43.962037 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.961999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-tmp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:43.962216 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:43.962081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgxp\" (UniqueName: \"kubernetes.io/projected/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-kube-api-access-rwgxp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.063458 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.063428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-tmp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.063637 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.063478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgxp\" (UniqueName: \"kubernetes.io/projected/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-kube-api-access-rwgxp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.063819 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.063798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-tmp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.074069 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.074040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rwgxp\" (UniqueName: \"kubernetes.io/projected/64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002-kube-api-access-rwgxp\") pod \"openshift-lws-operator-bfc7f696d-5dz78\" (UID: \"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.199127 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.199090 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" Apr 22 17:42:44.318301 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.318274 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78"] Apr 22 17:42:44.320990 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:42:44.320962 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64fc1c68_9769_4b9c_b1c9_fcd2f3fe8002.slice/crio-8413208fdbc2039ee2163ed0fc1fe555ec8a3df199f05fd49777f435597ee06c WatchSource:0}: Error finding container 8413208fdbc2039ee2163ed0fc1fe555ec8a3df199f05fd49777f435597ee06c: Status 404 returned error can't find the container with id 8413208fdbc2039ee2163ed0fc1fe555ec8a3df199f05fd49777f435597ee06c Apr 22 17:42:44.743974 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:44.743936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" event={"ID":"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002","Type":"ContainerStarted","Data":"8413208fdbc2039ee2163ed0fc1fe555ec8a3df199f05fd49777f435597ee06c"} Apr 22 17:42:46.753211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:46.753171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" event={"ID":"64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002","Type":"ContainerStarted","Data":"6b97a6a5dd94873baf2f4b6db9467deb2acc927505c5e4cd114047312b0f0361"} Apr 22 
17:42:46.769394 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:42:46.769342 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5dz78" podStartSLOduration=1.463755561 podStartE2EDuration="3.76932785s" podCreationTimestamp="2026-04-22 17:42:43 +0000 UTC" firstStartedPulling="2026-04-22 17:42:44.322334419 +0000 UTC m=+497.801591078" lastFinishedPulling="2026-04-22 17:42:46.627906707 +0000 UTC m=+500.107163367" observedRunningTime="2026-04-22 17:42:46.767337135 +0000 UTC m=+500.246593832" watchObservedRunningTime="2026-04-22 17:42:46.76932785 +0000 UTC m=+500.248584533" Apr 22 17:43:01.250189 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.250148 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw"] Apr 22 17:43:01.253692 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.253675 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.256069 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.256045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:43:01.256186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.256074 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:43:01.256818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.256794 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\"" Apr 22 17:43:01.259617 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.259596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw"] Apr 22 17:43:01.315042 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.315011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.315209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.315055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmb7t\" (UniqueName: \"kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" 
Apr 22 17:43:01.315209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.315147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.416473 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.416426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.416473 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.416486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmb7t\" (UniqueName: \"kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.416729 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.416523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.416924 ip-10-0-135-36 kubenswrapper[2572]: 
I0422 17:43:01.416901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.416995 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.416910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.424495 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.424470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmb7t\" (UniqueName: \"kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.563926 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.563840 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:01.685298 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.685273 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw"] Apr 22 17:43:01.687873 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:01.687841 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce223593_725a_427c_a5b3_7c06299df189.slice/crio-4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04 WatchSource:0}: Error finding container 4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04: Status 404 returned error can't find the container with id 4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04 Apr 22 17:43:01.806081 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.806047 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce223593-725a-427c-a5b3-7c06299df189" containerID="612612cd845eaaf3b9086790837718facacf60fb11a549db480f28c00c9c8d69" exitCode=0 Apr 22 17:43:01.806245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.806133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" event={"ID":"ce223593-725a-427c-a5b3-7c06299df189","Type":"ContainerDied","Data":"612612cd845eaaf3b9086790837718facacf60fb11a549db480f28c00c9c8d69"} Apr 22 17:43:01.806245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:01.806169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" event={"ID":"ce223593-725a-427c-a5b3-7c06299df189","Type":"ContainerStarted","Data":"4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04"} Apr 22 17:43:02.811539 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:43:02.811449 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce223593-725a-427c-a5b3-7c06299df189" containerID="c4eaf98111b203da11dd7f3a56773beee42c39a97cf098c7cd3099648ee5e991" exitCode=0 Apr 22 17:43:02.811539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:02.811488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" event={"ID":"ce223593-725a-427c-a5b3-7c06299df189","Type":"ContainerDied","Data":"c4eaf98111b203da11dd7f3a56773beee42c39a97cf098c7cd3099648ee5e991"} Apr 22 17:43:03.816447 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:03.816414 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce223593-725a-427c-a5b3-7c06299df189" containerID="c46e7457c07e347f6aeb96aed7402780ff694112d29df46fb5295e17576bcb37" exitCode=0 Apr 22 17:43:03.816916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:03.816485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" event={"ID":"ce223593-725a-427c-a5b3-7c06299df189","Type":"ContainerDied","Data":"c46e7457c07e347f6aeb96aed7402780ff694112d29df46fb5295e17576bcb37"} Apr 22 17:43:04.946167 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:04.946143 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:05.045370 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.045336 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util\") pod \"ce223593-725a-427c-a5b3-7c06299df189\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " Apr 22 17:43:05.045525 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.045394 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle\") pod \"ce223593-725a-427c-a5b3-7c06299df189\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " Apr 22 17:43:05.045525 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.045439 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmb7t\" (UniqueName: \"kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t\") pod \"ce223593-725a-427c-a5b3-7c06299df189\" (UID: \"ce223593-725a-427c-a5b3-7c06299df189\") " Apr 22 17:43:05.046303 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.046276 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle" (OuterVolumeSpecName: "bundle") pod "ce223593-725a-427c-a5b3-7c06299df189" (UID: "ce223593-725a-427c-a5b3-7c06299df189"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:05.047659 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.047627 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t" (OuterVolumeSpecName: "kube-api-access-gmb7t") pod "ce223593-725a-427c-a5b3-7c06299df189" (UID: "ce223593-725a-427c-a5b3-7c06299df189"). InnerVolumeSpecName "kube-api-access-gmb7t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:05.050457 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.050422 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util" (OuterVolumeSpecName: "util") pod "ce223593-725a-427c-a5b3-7c06299df189" (UID: "ce223593-725a-427c-a5b3-7c06299df189"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:05.146580 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.146545 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:05.146580 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.146574 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce223593-725a-427c-a5b3-7c06299df189-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:05.146580 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.146585 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmb7t\" (UniqueName: \"kubernetes.io/projected/ce223593-725a-427c-a5b3-7c06299df189-kube-api-access-gmb7t\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:05.831443 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.831414 2572 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" Apr 22 17:43:05.831621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.831404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k4fvw" event={"ID":"ce223593-725a-427c-a5b3-7c06299df189","Type":"ContainerDied","Data":"4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04"} Apr 22 17:43:05.831621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:05.831528 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3d57dd219509da9c210667ce8fb157bde1deeef3e30db91ee6ac784fc78c04" Apr 22 17:43:13.685605 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685568 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn"] Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685930 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="pull" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685943 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="pull" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685953 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="extract" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685958 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="extract" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685966 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="util" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.685972 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="util" Apr 22 17:43:13.686084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.686031 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce223593-725a-427c-a5b3-7c06299df189" containerName="extract" Apr 22 17:43:13.690477 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.690458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.692805 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.692782 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:43:13.692934 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.692782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\"" Apr 22 17:43:13.692934 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.692782 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:43:13.697180 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.697157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn"] Apr 22 17:43:13.822925 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.822892 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.823091 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.822944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.823091 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.823034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.924416 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.924378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.924604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.924522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.924671 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.924632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.924947 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.924926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.925038 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.924951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:13.944596 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.944532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 
17:43:14.000026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:13.999974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" Apr 22 17:43:14.128868 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:14.128842 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn"] Apr 22 17:43:14.131004 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:14.130974 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b2c354_ca37_48ec_8711_fdfeb1a3782b.slice/crio-99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324 WatchSource:0}: Error finding container 99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324: Status 404 returned error can't find the container with id 99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324 Apr 22 17:43:14.864019 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:14.863989 2572 generic.go:358] "Generic (PLEG): container finished" podID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerID="29bb5dc4f228591e9bf92bd9ea1037f30d96cb34666ade64ed4f6b217f1e1d55" exitCode=0 Apr 22 17:43:14.864413 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:14.864047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" event={"ID":"33b2c354-ca37-48ec-8711-fdfeb1a3782b","Type":"ContainerDied","Data":"29bb5dc4f228591e9bf92bd9ea1037f30d96cb34666ade64ed4f6b217f1e1d55"} Apr 22 17:43:14.864413 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:14.864088 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" 
event={"ID":"33b2c354-ca37-48ec-8711-fdfeb1a3782b","Type":"ContainerStarted","Data":"99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324"}
Apr 22 17:43:15.309263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.309225 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"]
Apr 22 17:43:15.312785 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.312768 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.314979 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.314939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 22 17:43:15.315091 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.315025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-zh7cd\""
Apr 22 17:43:15.315337 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.315321 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 22 17:43:15.323183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.323157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"]
Apr 22 17:43:15.436160 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.436119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.436363 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.436177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lww4z\" (UniqueName: \"kubernetes.io/projected/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-kube-api-access-lww4z\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.537229 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.537187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lww4z\" (UniqueName: \"kubernetes.io/projected/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-kube-api-access-lww4z\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.537407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.537276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.540098 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.540067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.546095 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.546067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lww4z\" (UniqueName: \"kubernetes.io/projected/cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9-kube-api-access-lww4z\") pod \"servicemesh-operator3-55f49c5f94-g6gjg\" (UID: \"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.622471 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.622363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:15.751217 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.751190 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"]
Apr 22 17:43:15.753855 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:15.753821 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5dc1ec_e621_47bc_b0ba_4e2b808b8bc9.slice/crio-eff64b31a2fa3395c381a91ee6683fb5b7760a31846989b9e838ef751d4ff5a6 WatchSource:0}: Error finding container eff64b31a2fa3395c381a91ee6683fb5b7760a31846989b9e838ef751d4ff5a6: Status 404 returned error can't find the container with id eff64b31a2fa3395c381a91ee6683fb5b7760a31846989b9e838ef751d4ff5a6
Apr 22 17:43:15.869405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.869372 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg" event={"ID":"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9","Type":"ContainerStarted","Data":"eff64b31a2fa3395c381a91ee6683fb5b7760a31846989b9e838ef751d4ff5a6"}
Apr 22 17:43:15.870892 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.870868 2572 generic.go:358] "Generic (PLEG): container finished" podID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerID="49e23373ed9901f81b008b61e86aa05cd8f26d2e66590d1cd93940a42cf86156" exitCode=0
Apr 22 17:43:15.871011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:15.870945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" event={"ID":"33b2c354-ca37-48ec-8711-fdfeb1a3782b","Type":"ContainerDied","Data":"49e23373ed9901f81b008b61e86aa05cd8f26d2e66590d1cd93940a42cf86156"}
Apr 22 17:43:16.878989 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:16.878945 2572 generic.go:358] "Generic (PLEG): container finished" podID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerID="0d188ca64c8764a940f375df956776cffe0bf6f6a5ba46dce63770b78927c4c3" exitCode=0
Apr 22 17:43:16.879443 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:16.879013 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" event={"ID":"33b2c354-ca37-48ec-8711-fdfeb1a3782b","Type":"ContainerDied","Data":"0d188ca64c8764a940f375df956776cffe0bf6f6a5ba46dce63770b78927c4c3"}
Apr 22 17:43:18.783537 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.783508 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn"
Apr 22 17:43:18.874453 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.874430 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9\") pod \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") "
Apr 22 17:43:18.874541 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.874466 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util\") pod \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") "
Apr 22 17:43:18.874613 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.874594 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle\") pod \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\" (UID: \"33b2c354-ca37-48ec-8711-fdfeb1a3782b\") "
Apr 22 17:43:18.875834 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.875804 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle" (OuterVolumeSpecName: "bundle") pod "33b2c354-ca37-48ec-8711-fdfeb1a3782b" (UID: "33b2c354-ca37-48ec-8711-fdfeb1a3782b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:43:18.876735 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.876711 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9" (OuterVolumeSpecName: "kube-api-access-qcgz9") pod "33b2c354-ca37-48ec-8711-fdfeb1a3782b" (UID: "33b2c354-ca37-48ec-8711-fdfeb1a3782b"). InnerVolumeSpecName "kube-api-access-qcgz9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:43:18.882118 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.882094 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util" (OuterVolumeSpecName: "util") pod "33b2c354-ca37-48ec-8711-fdfeb1a3782b" (UID: "33b2c354-ca37-48ec-8711-fdfeb1a3782b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:43:18.891587 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.891563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn" event={"ID":"33b2c354-ca37-48ec-8711-fdfeb1a3782b","Type":"ContainerDied","Data":"99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324"}
Apr 22 17:43:18.891670 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.891592 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a64e96c109b0d469667e0b4976a0d41c08e7b9f4d8cb616830eec114b48324"
Apr 22 17:43:18.891670 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.891596 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebwtfrn"
Apr 22 17:43:18.975521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.975485 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:43:18.975521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.975522 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcgz9\" (UniqueName: \"kubernetes.io/projected/33b2c354-ca37-48ec-8711-fdfeb1a3782b-kube-api-access-qcgz9\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:43:18.975763 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:18.975538 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/33b2c354-ca37-48ec-8711-fdfeb1a3782b-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:43:19.897444 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:19.897406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg" event={"ID":"cc5dc1ec-e621-47bc-b0ba-4e2b808b8bc9","Type":"ContainerStarted","Data":"fc8d6fe22dc94b624898bf4c51f789eef3c0d5b8d9ffb9afcb904e4a869a025f"}
Apr 22 17:43:19.897945 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:19.897526 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:19.916898 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:19.916851 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg" podStartSLOduration=1.841241794 podStartE2EDuration="4.916836503s" podCreationTimestamp="2026-04-22 17:43:15 +0000 UTC" firstStartedPulling="2026-04-22 17:43:15.7562168 +0000 UTC m=+529.235473473" lastFinishedPulling="2026-04-22 17:43:18.831811519 +0000 UTC m=+532.311068182" observedRunningTime="2026-04-22 17:43:19.914332898 +0000 UTC m=+533.393589580" watchObservedRunningTime="2026-04-22 17:43:19.916836503 +0000 UTC m=+533.396093185"
Apr 22 17:43:20.396068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396038 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"]
Apr 22 17:43:20.396410 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396397 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="extract"
Apr 22 17:43:20.396460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396412 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="extract"
Apr 22 17:43:20.396460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396427 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="pull"
Apr 22 17:43:20.396460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396432 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="pull"
Apr 22 17:43:20.396460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396449 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="util"
Apr 22 17:43:20.396460 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396454 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="util"
Apr 22 17:43:20.396629 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.396507 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b2c354-ca37-48ec-8711-fdfeb1a3782b" containerName="extract"
Apr 22 17:43:20.399572 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.399553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.402328 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.402307 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 17:43:20.402565 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.402550 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 17:43:20.402651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.402583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9qxl4\""
Apr 22 17:43:20.402841 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.402826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 17:43:20.427617 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.427589 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"]
Apr 22 17:43:20.490691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.490648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.490894 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.490725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-metrics-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.490894 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.490789 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/59ab3137-77a7-4d89-9a7d-31b1045686f1-manager-config\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.490894 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.490866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdtz\" (UniqueName: \"kubernetes.io/projected/59ab3137-77a7-4d89-9a7d-31b1045686f1-kube-api-access-6vdtz\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.591823 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.591789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.591823 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.591831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-metrics-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.592071 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.591873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/59ab3137-77a7-4d89-9a7d-31b1045686f1-manager-config\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.592071 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.591905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdtz\" (UniqueName: \"kubernetes.io/projected/59ab3137-77a7-4d89-9a7d-31b1045686f1-kube-api-access-6vdtz\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.592551 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.592525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/59ab3137-77a7-4d89-9a7d-31b1045686f1-manager-config\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.594363 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.594341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-metrics-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.594472 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.594388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ab3137-77a7-4d89-9a7d-31b1045686f1-cert\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.600467 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.600446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdtz\" (UniqueName: \"kubernetes.io/projected/59ab3137-77a7-4d89-9a7d-31b1045686f1-kube-api-access-6vdtz\") pod \"lws-controller-manager-959c974c-qb4dq\" (UID: \"59ab3137-77a7-4d89-9a7d-31b1045686f1\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.709603 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.709513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:20.837751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.837720 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"]
Apr 22 17:43:20.839559 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:20.839532 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ab3137_77a7_4d89_9a7d_31b1045686f1.slice/crio-c8913823a8c2b9b8536e39a76c489f54796ad6dc6a43a40e46aac4d65b81554a WatchSource:0}: Error finding container c8913823a8c2b9b8536e39a76c489f54796ad6dc6a43a40e46aac4d65b81554a: Status 404 returned error can't find the container with id c8913823a8c2b9b8536e39a76c489f54796ad6dc6a43a40e46aac4d65b81554a
Apr 22 17:43:20.902316 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:20.902278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq" event={"ID":"59ab3137-77a7-4d89-9a7d-31b1045686f1","Type":"ContainerStarted","Data":"c8913823a8c2b9b8536e39a76c489f54796ad6dc6a43a40e46aac4d65b81554a"}
Apr 22 17:43:22.912305 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:22.912261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq" event={"ID":"59ab3137-77a7-4d89-9a7d-31b1045686f1","Type":"ContainerStarted","Data":"d688914c047b6082134bd6fbb07b07109351147ae015447d56301cc66a9cf53f"}
Apr 22 17:43:22.912688 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:22.912315 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:22.932516 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:22.932464 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq" podStartSLOduration=1.322976872 podStartE2EDuration="2.932448815s" podCreationTimestamp="2026-04-22 17:43:20 +0000 UTC" firstStartedPulling="2026-04-22 17:43:20.841405867 +0000 UTC m=+534.320662527" lastFinishedPulling="2026-04-22 17:43:22.450877811 +0000 UTC m=+535.930134470" observedRunningTime="2026-04-22 17:43:22.930830293 +0000 UTC m=+536.410087007" watchObservedRunningTime="2026-04-22 17:43:22.932448815 +0000 UTC m=+536.411705496"
Apr 22 17:43:30.906821 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:30.906792 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g6gjg"
Apr 22 17:43:33.918041 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:33.918006 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-959c974c-qb4dq"
Apr 22 17:43:35.873960 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.873927 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59c5574b9d-hkr9q"]
Apr 22 17:43:35.879367 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.879344 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.885823 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.885795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5574b9d-hkr9q"]
Apr 22 17:43:35.911799 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-oauth-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.911959 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-service-ca\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.911959 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.911959 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.911959 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-oauth-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.912083 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.911990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-trusted-ca-bundle\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:35.912083 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:35.912026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcj7g\" (UniqueName: \"kubernetes.io/projected/5d4de06e-f975-47e8-9a8e-9cfa268d7968-kube-api-access-xcj7g\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012510 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-oauth-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012530 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-trusted-ca-bundle\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcj7g\" (UniqueName: \"kubernetes.io/projected/5d4de06e-f975-47e8-9a8e-9cfa268d7968-kube-api-access-xcj7g\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-oauth-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.012777 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.012607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-service-ca\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.013350 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.013321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.013489 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.013321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-oauth-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.013489 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.013394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-service-ca\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.013604 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.013510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4de06e-f975-47e8-9a8e-9cfa268d7968-trusted-ca-bundle\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.015561 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.015529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-oauth-config\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.015561 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.015543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4de06e-f975-47e8-9a8e-9cfa268d7968-console-serving-cert\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.021681 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.021653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcj7g\" (UniqueName: \"kubernetes.io/projected/5d4de06e-f975-47e8-9a8e-9cfa268d7968-kube-api-access-xcj7g\") pod \"console-59c5574b9d-hkr9q\" (UID: \"5d4de06e-f975-47e8-9a8e-9cfa268d7968\") " pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.190505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.190473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c5574b9d-hkr9q"
Apr 22 17:43:36.326184 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.326155 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5574b9d-hkr9q"]
Apr 22 17:43:36.327919 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:36.327893 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4de06e_f975_47e8_9a8e_9cfa268d7968.slice/crio-e029b7c628a0928ba20e0daebf0bb87859a0be0983a896ef36c53095bfddfb25 WatchSource:0}: Error finding container e029b7c628a0928ba20e0daebf0bb87859a0be0983a896ef36c53095bfddfb25: Status 404 returned error can't find the container with id e029b7c628a0928ba20e0daebf0bb87859a0be0983a896ef36c53095bfddfb25
Apr 22 17:43:36.966512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.966480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5574b9d-hkr9q" event={"ID":"5d4de06e-f975-47e8-9a8e-9cfa268d7968","Type":"ContainerStarted","Data":"d99b5304804fda058d2cd1a0b7304a5ec6ce72cb4ee4cc911f75a6c577c36cc4"}
Apr 22 17:43:36.966512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.966516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5574b9d-hkr9q" event={"ID":"5d4de06e-f975-47e8-9a8e-9cfa268d7968","Type":"ContainerStarted","Data":"e029b7c628a0928ba20e0daebf0bb87859a0be0983a896ef36c53095bfddfb25"}
Apr 22 17:43:36.986377 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:36.986322 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59c5574b9d-hkr9q" podStartSLOduration=1.986302984 podStartE2EDuration="1.986302984s" podCreationTimestamp="2026-04-22 17:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:43:36.984818259 +0000 UTC m=+550.464074943" watchObservedRunningTime="2026-04-22 17:43:36.986302984 +0000 UTC m=+550.465559667"
Apr 22 17:43:41.021742 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.021707 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"]
Apr 22 17:43:41.025729 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.025689 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.028750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.028732 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vtcg6\""
Apr 22 17:43:41.029111 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.029096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 17:43:41.029477 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.029455 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 17:43:41.035973 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.035948 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"]
Apr 22 17:43:41.059198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.059172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bjjx\" (UniqueName: \"kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.059336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.059218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.059336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.059237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.121108 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.121076 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp"]
Apr 22 17:43:41.124674 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.124657 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp"
Apr 22 17:43:41.131577 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.131543 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp"]
Apr 22 17:43:41.160375 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.160375 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"
Apr 22 17:43:41.160595 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4mmn\" (UniqueName: \"kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp"
Apr 22 17:43:41.160595 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.160595 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.160595 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bjjx\" (UniqueName: \"kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:41.160754 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:41.160810 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.160793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:41.168756 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.168726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bjjx\" (UniqueName: \"kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:41.218893 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.218863 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v"] Apr 22 17:43:41.222587 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.222570 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.229940 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.229915 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v"] Apr 22 17:43:41.261388 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4mmn\" (UniqueName: \"kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.261564 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.261564 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.261564 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.261564 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjk77\" (UniqueName: \"kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.261564 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.261862 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.261895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.261851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.270520 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.270497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4mmn\" (UniqueName: \"kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.323751 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.323658 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh"] Apr 22 17:43:41.331469 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.331443 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.336067 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.336036 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:41.341146 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.341123 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh"] Apr 22 17:43:41.362040 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.362211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.362211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.362306 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.362306 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v79rf\" (UniqueName: \"kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.362408 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjk77\" (UniqueName: \"kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.362454 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.362640 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.362613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.371077 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.371047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjk77\" (UniqueName: \"kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.435267 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.435232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:41.463844 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.463790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.464063 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.463856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v79rf\" (UniqueName: \"kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.464063 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.463957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.464362 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.464314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.464433 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.464397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.465273 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.465246 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw"] Apr 22 17:43:41.467570 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:41.467547 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a9f445_a61c_4ec1_b5cb_19eefbce24fd.slice/crio-a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73 WatchSource:0}: Error finding container a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73: Status 404 returned error can't find the container with id a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73 Apr 22 17:43:41.471973 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.471953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v79rf\" (UniqueName: \"kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.532791 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.532760 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:41.574425 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.574398 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp"] Apr 22 17:43:41.577273 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:41.577233 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f1601e_5d2f_41c5_b2c7_832099f71969.slice/crio-90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997 WatchSource:0}: Error finding container 90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997: Status 404 returned error can't find the container with id 90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997 Apr 22 17:43:41.642828 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.642801 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:41.693660 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.693636 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v"] Apr 22 17:43:41.696911 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:41.696866 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c0ddd48_55b5_41ef_a1c4_5a535d07a07f.slice/crio-f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776 WatchSource:0}: Error finding container f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776: Status 404 returned error can't find the container with id f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776 Apr 22 17:43:41.781182 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.781157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh"] Apr 22 17:43:41.788932 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:43:41.788903 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d49d4a_eee7_469d_bc31_2ebb98badfa5.slice/crio-be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff WatchSource:0}: Error finding container be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff: Status 404 returned error can't find the container with id be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff Apr 22 17:43:41.987632 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.987595 2572 generic.go:358] "Generic (PLEG): container finished" podID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerID="853d0db0d7ed6085c5f9e403a5cc3478667f0421951b6e93124324c151f6b8af" exitCode=0 Apr 22 17:43:41.987812 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.987671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" event={"ID":"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f","Type":"ContainerDied","Data":"853d0db0d7ed6085c5f9e403a5cc3478667f0421951b6e93124324c151f6b8af"} Apr 22 17:43:41.987812 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.987717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" event={"ID":"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f","Type":"ContainerStarted","Data":"f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776"} Apr 22 17:43:41.989127 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.989077 2572 generic.go:358] "Generic (PLEG): container finished" podID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerID="e7ca8974032a5020c93296f048ff82c2b3189f37136d40bdfde55ff04e28caf8" exitCode=0 Apr 22 17:43:41.989193 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.989155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerDied","Data":"e7ca8974032a5020c93296f048ff82c2b3189f37136d40bdfde55ff04e28caf8"} Apr 22 17:43:41.989193 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.989181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerStarted","Data":"a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73"} Apr 22 17:43:41.990563 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.990545 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7f1601e-5d2f-41c5-b2c7-832099f71969" 
containerID="a2d5ddd1121821ee10fd455fa416638556a1b585591f78823b6889c2b0f8313d" exitCode=0 Apr 22 17:43:41.990636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.990612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerDied","Data":"a2d5ddd1121821ee10fd455fa416638556a1b585591f78823b6889c2b0f8313d"} Apr 22 17:43:41.990636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.990628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerStarted","Data":"90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997"} Apr 22 17:43:41.991983 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.991962 2572 generic.go:358] "Generic (PLEG): container finished" podID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerID="d16176d69c30e7c72e0d851d4eff1ef26a520c9a68d4f73f661f6140b018baa9" exitCode=0 Apr 22 17:43:41.992072 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.991997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerDied","Data":"d16176d69c30e7c72e0d851d4eff1ef26a520c9a68d4f73f661f6140b018baa9"} Apr 22 17:43:41.992072 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:41.992013 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerStarted","Data":"be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff"} Apr 22 17:43:43.002162 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:43.002130 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerStarted","Data":"92ad067dbf05e0c8ddbc53b3e959c2fa82f656568b06859f3d773f12f6afe325"} Apr 22 17:43:43.004025 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:43.003994 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerStarted","Data":"c3d90fc77ce4d68d5611f5055472a883bdba17e915fd971590465996ee0a03ef"} Apr 22 17:43:43.006096 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:43.006068 2572 generic.go:358] "Generic (PLEG): container finished" podID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerID="eaf1c931ce2fb94458eeb902539bca0d6d3d265e212fb39c551f52aee2555aa1" exitCode=0 Apr 22 17:43:43.006211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:43.006148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" event={"ID":"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f","Type":"ContainerDied","Data":"eaf1c931ce2fb94458eeb902539bca0d6d3d265e212fb39c551f52aee2555aa1"} Apr 22 17:43:43.008050 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:43.007953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerStarted","Data":"cfbbb7881d837505bb76c1efea7ff278187332cce208b5e2dda1b837f0d3fb03"} Apr 22 17:43:44.013646 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.013607 2572 generic.go:358] "Generic (PLEG): container finished" podID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerID="40c2bcff149874af05c0f2a4f3d55febca18804dac1a01588363d0a9594c78ed" exitCode=0 Apr 22 17:43:44.014110 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:43:44.013690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" event={"ID":"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f","Type":"ContainerDied","Data":"40c2bcff149874af05c0f2a4f3d55febca18804dac1a01588363d0a9594c78ed"} Apr 22 17:43:44.015206 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.015185 2572 generic.go:358] "Generic (PLEG): container finished" podID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerID="cfbbb7881d837505bb76c1efea7ff278187332cce208b5e2dda1b837f0d3fb03" exitCode=0 Apr 22 17:43:44.015303 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.015255 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerDied","Data":"cfbbb7881d837505bb76c1efea7ff278187332cce208b5e2dda1b837f0d3fb03"} Apr 22 17:43:44.016850 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.016826 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerID="92ad067dbf05e0c8ddbc53b3e959c2fa82f656568b06859f3d773f12f6afe325" exitCode=0 Apr 22 17:43:44.016962 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.016911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerDied","Data":"92ad067dbf05e0c8ddbc53b3e959c2fa82f656568b06859f3d773f12f6afe325"} Apr 22 17:43:44.018664 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.018645 2572 generic.go:358] "Generic (PLEG): container finished" podID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerID="c3d90fc77ce4d68d5611f5055472a883bdba17e915fd971590465996ee0a03ef" exitCode=0 Apr 22 17:43:44.018774 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:44.018669 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerDied","Data":"c3d90fc77ce4d68d5611f5055472a883bdba17e915fd971590465996ee0a03ef"} Apr 22 17:43:45.024789 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.024756 2572 generic.go:358] "Generic (PLEG): container finished" podID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerID="59318cfe7795112e8fc35c2c0060c00af47cbd04f4c0961d4d1ed5962a62389c" exitCode=0 Apr 22 17:43:45.025260 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.024851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerDied","Data":"59318cfe7795112e8fc35c2c0060c00af47cbd04f4c0961d4d1ed5962a62389c"} Apr 22 17:43:45.026646 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.026623 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerID="60173845996a092eebd217c46ba32f1dc98bdec550c5aabf56fef19e6f3d1836" exitCode=0 Apr 22 17:43:45.026748 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.026717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerDied","Data":"60173845996a092eebd217c46ba32f1dc98bdec550c5aabf56fef19e6f3d1836"} Apr 22 17:43:45.028405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.028383 2572 generic.go:358] "Generic (PLEG): container finished" podID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerID="bf5e46971b51ef791db6d30dbec42ff5bd9be116f06eab16908fd9391b4e51f8" exitCode=0 Apr 22 17:43:45.028505 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.028465 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerDied","Data":"bf5e46971b51ef791db6d30dbec42ff5bd9be116f06eab16908fd9391b4e51f8"} Apr 22 17:43:45.154558 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.154531 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:45.199197 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.199164 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util\") pod \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " Apr 22 17:43:45.199383 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.199210 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle\") pod \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " Apr 22 17:43:45.199383 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.199331 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjk77\" (UniqueName: \"kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77\") pod \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\" (UID: \"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f\") " Apr 22 17:43:45.200264 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.200225 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle" (OuterVolumeSpecName: "bundle") pod "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" (UID: "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:45.201455 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.201428 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77" (OuterVolumeSpecName: "kube-api-access-pjk77") pod "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" (UID: "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f"). InnerVolumeSpecName "kube-api-access-pjk77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:45.204453 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.204428 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util" (OuterVolumeSpecName: "util") pod "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" (UID: "0c0ddd48-55b5-41ef-a1c4-5a535d07a07f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:45.300791 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.300667 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjk77\" (UniqueName: \"kubernetes.io/projected/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-kube-api-access-pjk77\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:45.300791 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.300734 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:45.300791 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:45.300744 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0ddd48-55b5-41ef-a1c4-5a535d07a07f-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.033579 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.033540 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" event={"ID":"0c0ddd48-55b5-41ef-a1c4-5a535d07a07f","Type":"ContainerDied","Data":"f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776"} Apr 22 17:43:46.033579 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.033581 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c1f08da29574ae2e2d49938f2168b3c22570cf58ee5259529be52615bfe776" Apr 22 17:43:46.034015 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.033606 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c306wn9v" Apr 22 17:43:46.190631 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.190575 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59c5574b9d-hkr9q" Apr 22 17:43:46.190631 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.190613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59c5574b9d-hkr9q" Apr 22 17:43:46.196541 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.196516 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:46.197177 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.197158 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59c5574b9d-hkr9q" Apr 22 17:43:46.200428 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.200411 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:46.203654 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.203639 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:46.309186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309101 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bjjx\" (UniqueName: \"kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx\") pod \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " Apr 22 17:43:46.309186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309150 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle\") pod \"a7f1601e-5d2f-41c5-b2c7-832099f71969\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " Apr 22 17:43:46.309186 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309177 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util\") pod \"a7f1601e-5d2f-41c5-b2c7-832099f71969\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " Apr 22 17:43:46.309409 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309345 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4mmn\" (UniqueName: \"kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn\") pod \"a7f1601e-5d2f-41c5-b2c7-832099f71969\" (UID: \"a7f1601e-5d2f-41c5-b2c7-832099f71969\") " Apr 22 17:43:46.309456 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309412 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util\") pod \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " Apr 22 17:43:46.309515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309458 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util\") pod \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " Apr 22 17:43:46.309515 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309485 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle\") pod \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " Apr 22 17:43:46.309615 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309514 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v79rf\" (UniqueName: \"kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf\") pod \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\" (UID: \"29d49d4a-eee7-469d-bc31-2ebb98badfa5\") " Apr 22 17:43:46.309615 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.309542 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle\") pod \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\" (UID: \"75a9f445-a61c-4ec1-b5cb-19eefbce24fd\") " Apr 22 17:43:46.310495 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.310052 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle" (OuterVolumeSpecName: "bundle") pod "a7f1601e-5d2f-41c5-b2c7-832099f71969" (UID: 
"a7f1601e-5d2f-41c5-b2c7-832099f71969"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.310495 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.310271 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle" (OuterVolumeSpecName: "bundle") pod "29d49d4a-eee7-469d-bc31-2ebb98badfa5" (UID: "29d49d4a-eee7-469d-bc31-2ebb98badfa5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.311081 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.311042 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle" (OuterVolumeSpecName: "bundle") pod "75a9f445-a61c-4ec1-b5cb-19eefbce24fd" (UID: "75a9f445-a61c-4ec1-b5cb-19eefbce24fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.311901 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.311875 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn" (OuterVolumeSpecName: "kube-api-access-j4mmn") pod "a7f1601e-5d2f-41c5-b2c7-832099f71969" (UID: "a7f1601e-5d2f-41c5-b2c7-832099f71969"). InnerVolumeSpecName "kube-api-access-j4mmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:46.312000 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.311915 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx" (OuterVolumeSpecName: "kube-api-access-5bjjx") pod "75a9f445-a61c-4ec1-b5cb-19eefbce24fd" (UID: "75a9f445-a61c-4ec1-b5cb-19eefbce24fd"). InnerVolumeSpecName "kube-api-access-5bjjx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:46.312537 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.312515 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf" (OuterVolumeSpecName: "kube-api-access-v79rf") pod "29d49d4a-eee7-469d-bc31-2ebb98badfa5" (UID: "29d49d4a-eee7-469d-bc31-2ebb98badfa5"). InnerVolumeSpecName "kube-api-access-v79rf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:46.315121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.315092 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util" (OuterVolumeSpecName: "util") pod "29d49d4a-eee7-469d-bc31-2ebb98badfa5" (UID: "29d49d4a-eee7-469d-bc31-2ebb98badfa5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.315726 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.315686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util" (OuterVolumeSpecName: "util") pod "a7f1601e-5d2f-41c5-b2c7-832099f71969" (UID: "a7f1601e-5d2f-41c5-b2c7-832099f71969"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.316744 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.316722 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util" (OuterVolumeSpecName: "util") pod "75a9f445-a61c-4ec1-b5cb-19eefbce24fd" (UID: "75a9f445-a61c-4ec1-b5cb-19eefbce24fd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:43:46.411183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411130 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411178 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411190 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29d49d4a-eee7-469d-bc31-2ebb98badfa5-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411203 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v79rf\" (UniqueName: \"kubernetes.io/projected/29d49d4a-eee7-469d-bc31-2ebb98badfa5-kube-api-access-v79rf\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411221 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411234 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bjjx\" (UniqueName: \"kubernetes.io/projected/75a9f445-a61c-4ec1-b5cb-19eefbce24fd-kube-api-access-5bjjx\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411245 2572 reconciler_common.go:299] "Volume 
detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411257 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7f1601e-5d2f-41c5-b2c7-832099f71969-util\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:46.411440 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:46.411267 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4mmn\" (UniqueName: \"kubernetes.io/projected/a7f1601e-5d2f-41c5-b2c7-832099f71969-kube-api-access-j4mmn\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:43:47.038854 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.038819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" event={"ID":"75a9f445-a61c-4ec1-b5cb-19eefbce24fd","Type":"ContainerDied","Data":"a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73"} Apr 22 17:43:47.038854 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.038854 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0bd5389dca7493cc3651f2a5ecc6961c5027c2c16f952ba7f7259ab7208fa73" Apr 22 17:43:47.039307 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.038862 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec888gcsw" Apr 22 17:43:47.040511 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.040487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" event={"ID":"a7f1601e-5d2f-41c5-b2c7-832099f71969","Type":"ContainerDied","Data":"90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997"} Apr 22 17:43:47.040511 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.040512 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d54eb03885b24698a933182ea148d7619d74919cfa68d9df24e30c4d1a6997" Apr 22 17:43:47.040742 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.040526 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lczsp" Apr 22 17:43:47.042194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.042172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" event={"ID":"29d49d4a-eee7-469d-bc31-2ebb98badfa5","Type":"ContainerDied","Data":"be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff"} Apr 22 17:43:47.042290 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.042199 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be89c1cf10f0139983d26ef59070dba94818b18d62889939bb28fd41bc9381ff" Apr 22 17:43:47.042290 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.042219 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brfnhh" Apr 22 17:43:47.046302 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.046284 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59c5574b9d-hkr9q" Apr 22 17:43:47.100801 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:43:47.100775 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"] Apr 22 17:44:13.064183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.064122 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6456cb9b5f-jtnsm" podUID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" containerName="console" containerID="cri-o://60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6" gracePeriod=15 Apr 22 17:44:13.318095 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.318036 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6456cb9b5f-jtnsm_a6ab0422-ac23-4482-a3a5-a71e2c0c8011/console/0.log" Apr 22 17:44:13.318095 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.318094 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6456cb9b5f-jtnsm" Apr 22 17:44:13.446830 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.446794 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447010 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.446843 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447010 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.446904 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447010 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.446949 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wnl\" (UniqueName: \"kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447010 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.446985 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447215 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:44:13.447032 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447215 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.447069 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config\") pod \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\" (UID: \"a6ab0422-ac23-4482-a3a5-a71e2c0c8011\") " Apr 22 17:44:13.447439 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.447391 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:44:13.447439 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.447410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca" (OuterVolumeSpecName: "service-ca") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:44:13.447584 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.447435 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:44:13.447584 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.447531 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config" (OuterVolumeSpecName: "console-config") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:44:13.449117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.449081 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:44:13.449245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.449222 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:44:13.449299 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.449242 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl" (OuterVolumeSpecName: "kube-api-access-k5wnl") pod "a6ab0422-ac23-4482-a3a5-a71e2c0c8011" (UID: "a6ab0422-ac23-4482-a3a5-a71e2c0c8011"). InnerVolumeSpecName "kube-api-access-k5wnl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:44:13.548355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548315 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-oauth-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548349 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5wnl\" (UniqueName: \"kubernetes.io/projected/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-kube-api-access-k5wnl\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548364 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-service-ca\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548610 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548377 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-trusted-ca-bundle\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548610 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548390 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-oauth-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548610 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548403 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-serving-cert\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:13.548610 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:13.548414 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ab0422-ac23-4482-a3a5-a71e2c0c8011-console-config\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:44:14.144450 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144421 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6456cb9b5f-jtnsm_a6ab0422-ac23-4482-a3a5-a71e2c0c8011/console/0.log" Apr 22 17:44:14.144918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144467 2572 generic.go:358] "Generic (PLEG): container finished" podID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" containerID="60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6" exitCode=2 Apr 22 17:44:14.144918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144525 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6456cb9b5f-jtnsm"
Apr 22 17:44:14.144918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6456cb9b5f-jtnsm" event={"ID":"a6ab0422-ac23-4482-a3a5-a71e2c0c8011","Type":"ContainerDied","Data":"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"}
Apr 22 17:44:14.144918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6456cb9b5f-jtnsm" event={"ID":"a6ab0422-ac23-4482-a3a5-a71e2c0c8011","Type":"ContainerDied","Data":"3c32cf7cd1ca5ec3fea2d44eeb53056d97049e719b7f63fc630937907664b67a"}
Apr 22 17:44:14.144918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.144607 2572 scope.go:117] "RemoveContainer" containerID="60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"
Apr 22 17:44:14.154230 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.154208 2572 scope.go:117] "RemoveContainer" containerID="60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"
Apr 22 17:44:14.154491 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:44:14.154471 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6\": container with ID starting with 60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6 not found: ID does not exist" containerID="60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"
Apr 22 17:44:14.154554 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.154498 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6"} err="failed to get container status \"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6\": rpc error: code = NotFound desc = could not find container \"60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6\": container with ID starting with 60a350c12ad6b16113c3149c6592f40815a1bf192df9555ef515b2cb3e8bd2f6 not found: ID does not exist"
Apr 22 17:44:14.165792 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.165766 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"]
Apr 22 17:44:14.168984 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:14.168962 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6456cb9b5f-jtnsm"]
Apr 22 17:44:15.098917 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:15.098885 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" path="/var/lib/kubelet/pods/a6ab0422-ac23-4482-a3a5-a71e2c0c8011/volumes"
Apr 22 17:44:27.020617 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:27.020584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:44:27.021388 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:27.021370 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:44:51.842882 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.842803 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-drzkg"]
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843168 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843178 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843190 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843195 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843201 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843207 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843217 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843222 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843229 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843233 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843240 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" containerName="console"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843244 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" containerName="console"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843250 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843255 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843262 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843266 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843271 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843276 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843284 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843289 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843294 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843299 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843306 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843311 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="util"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843317 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843322 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="pull"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843377 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="75a9f445-a61c-4ec1-b5cb-19eefbce24fd" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843386 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="29d49d4a-eee7-469d-bc31-2ebb98badfa5" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843392 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c0ddd48-55b5-41ef-a1c4-5a535d07a07f" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843399 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7f1601e-5d2f-41c5-b2c7-832099f71969" containerName="extract"
Apr 22 17:44:51.845244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.843406 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6ab0422-ac23-4482-a3a5-a71e2c0c8011" containerName="console"
Apr 22 17:44:51.846277 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.846261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:51.848340 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.848310 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 17:44:51.848987 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.848969 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 17:44:51.849105 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.848986 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-64xhv\""
Apr 22 17:44:51.849105 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.848998 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 17:44:51.852895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.852878 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-drzkg"]
Apr 22 17:44:51.879281 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.879250 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-drzkg"]
Apr 22 17:44:51.983189 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.983148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/08478191-02d1-4e5b-bb2b-84c1fa06481d-config-file\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:51.983355 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:51.983284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb6l\" (UniqueName: \"kubernetes.io/projected/08478191-02d1-4e5b-bb2b-84c1fa06481d-kube-api-access-cmb6l\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.084141 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.084106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb6l\" (UniqueName: \"kubernetes.io/projected/08478191-02d1-4e5b-bb2b-84c1fa06481d-kube-api-access-cmb6l\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.084318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.084156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/08478191-02d1-4e5b-bb2b-84c1fa06481d-config-file\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.084729 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.084673 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/08478191-02d1-4e5b-bb2b-84c1fa06481d-config-file\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.091474 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.091447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb6l\" (UniqueName: \"kubernetes.io/projected/08478191-02d1-4e5b-bb2b-84c1fa06481d-kube-api-access-cmb6l\") pod \"limitador-limitador-67566c68b4-drzkg\" (UID: \"08478191-02d1-4e5b-bb2b-84c1fa06481d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.157423 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.157386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:52.249863 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.249832 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:44:52.254216 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.254193 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:52.257276 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.257068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-xp2rz\""
Apr 22 17:44:52.261583 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.261537 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:44:52.286211 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.286189 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-drzkg"]
Apr 22 17:44:52.288840 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:44:52.288812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08478191_02d1_4e5b_bb2b_84c1fa06481d.slice/crio-a2538d1780afeb3411be61f8e3ba9becaa853f0e29320aac8bcf11056e137c49 WatchSource:0}: Error finding container a2538d1780afeb3411be61f8e3ba9becaa853f0e29320aac8bcf11056e137c49: Status 404 returned error can't find the container with id a2538d1780afeb3411be61f8e3ba9becaa853f0e29320aac8bcf11056e137c49
Apr 22 17:44:52.290808 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.290793 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:44:52.386636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.386604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fntz\" (UniqueName: \"kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz\") pod \"authorino-674b59b84c-9bhk5\" (UID: \"fed1266f-f14f-45bc-9f91-840f80f3de56\") " pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:52.487180 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.487092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fntz\" (UniqueName: \"kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz\") pod \"authorino-674b59b84c-9bhk5\" (UID: \"fed1266f-f14f-45bc-9f91-840f80f3de56\") " pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:52.494294 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.494264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fntz\" (UniqueName: \"kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz\") pod \"authorino-674b59b84c-9bhk5\" (UID: \"fed1266f-f14f-45bc-9f91-840f80f3de56\") " pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:52.566177 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.566146 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:52.687057 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:52.687030 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:44:52.688165 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:44:52.688139 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed1266f_f14f_45bc_9f91_840f80f3de56.slice/crio-7933616856b5355574cbe2131ad0cad58cd1ac148021c18a397fb7533f29a37d WatchSource:0}: Error finding container 7933616856b5355574cbe2131ad0cad58cd1ac148021c18a397fb7533f29a37d: Status 404 returned error can't find the container with id 7933616856b5355574cbe2131ad0cad58cd1ac148021c18a397fb7533f29a37d
Apr 22 17:44:53.295729 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:53.295662 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg" event={"ID":"08478191-02d1-4e5b-bb2b-84c1fa06481d","Type":"ContainerStarted","Data":"a2538d1780afeb3411be61f8e3ba9becaa853f0e29320aac8bcf11056e137c49"}
Apr 22 17:44:53.297061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:53.297030 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-9bhk5" event={"ID":"fed1266f-f14f-45bc-9f91-840f80f3de56","Type":"ContainerStarted","Data":"7933616856b5355574cbe2131ad0cad58cd1ac148021c18a397fb7533f29a37d"}
Apr 22 17:44:55.996896 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:55.996857 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:44:58.326163 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.326126 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-9bhk5" podUID="fed1266f-f14f-45bc-9f91-840f80f3de56" containerName="authorino" containerID="cri-o://2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8" gracePeriod=30
Apr 22 17:44:58.326619 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.326124 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-9bhk5" event={"ID":"fed1266f-f14f-45bc-9f91-840f80f3de56","Type":"ContainerStarted","Data":"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"}
Apr 22 17:44:58.327511 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.327482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg" event={"ID":"08478191-02d1-4e5b-bb2b-84c1fa06481d","Type":"ContainerStarted","Data":"66399ac5d7fb4b2f5bd588b3e7af3212ccc8bd01caec5eeb4792530a8547e92c"}
Apr 22 17:44:58.327647 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.327634 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:44:58.340058 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.339980 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-9bhk5" podStartSLOduration=1.646165636 podStartE2EDuration="6.339963341s" podCreationTimestamp="2026-04-22 17:44:52 +0000 UTC" firstStartedPulling="2026-04-22 17:44:52.6894442 +0000 UTC m=+626.168700860" lastFinishedPulling="2026-04-22 17:44:57.383241901 +0000 UTC m=+630.862498565" observedRunningTime="2026-04-22 17:44:58.339247561 +0000 UTC m=+631.818504242" watchObservedRunningTime="2026-04-22 17:44:58.339963341 +0000 UTC m=+631.819220024"
Apr 22 17:44:58.353988 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.353933 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg" podStartSLOduration=2.32049093 podStartE2EDuration="7.353916006s" podCreationTimestamp="2026-04-22 17:44:51 +0000 UTC" firstStartedPulling="2026-04-22 17:44:52.290919037 +0000 UTC m=+625.770175698" lastFinishedPulling="2026-04-22 17:44:57.324344114 +0000 UTC m=+630.803600774" observedRunningTime="2026-04-22 17:44:58.353107135 +0000 UTC m=+631.832363816" watchObservedRunningTime="2026-04-22 17:44:58.353916006 +0000 UTC m=+631.833172688"
Apr 22 17:44:58.574715 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.574684 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:58.751922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.751885 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fntz\" (UniqueName: \"kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz\") pod \"fed1266f-f14f-45bc-9f91-840f80f3de56\" (UID: \"fed1266f-f14f-45bc-9f91-840f80f3de56\") "
Apr 22 17:44:58.754088 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.754057 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz" (OuterVolumeSpecName: "kube-api-access-8fntz") pod "fed1266f-f14f-45bc-9f91-840f80f3de56" (UID: "fed1266f-f14f-45bc-9f91-840f80f3de56"). InnerVolumeSpecName "kube-api-access-8fntz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:44:58.852436 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:58.852403 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8fntz\" (UniqueName: \"kubernetes.io/projected/fed1266f-f14f-45bc-9f91-840f80f3de56-kube-api-access-8fntz\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:44:59.332457 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.332415 2572 generic.go:358] "Generic (PLEG): container finished" podID="fed1266f-f14f-45bc-9f91-840f80f3de56" containerID="2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8" exitCode=0
Apr 22 17:44:59.332971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.332479 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-9bhk5"
Apr 22 17:44:59.332971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.332505 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-9bhk5" event={"ID":"fed1266f-f14f-45bc-9f91-840f80f3de56","Type":"ContainerDied","Data":"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"}
Apr 22 17:44:59.332971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.332542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-9bhk5" event={"ID":"fed1266f-f14f-45bc-9f91-840f80f3de56","Type":"ContainerDied","Data":"7933616856b5355574cbe2131ad0cad58cd1ac148021c18a397fb7533f29a37d"}
Apr 22 17:44:59.332971 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.332561 2572 scope.go:117] "RemoveContainer" containerID="2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"
Apr 22 17:44:59.341209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.341191 2572 scope.go:117] "RemoveContainer" containerID="2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"
Apr 22 17:44:59.341487 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:44:59.341467 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8\": container with ID starting with 2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8 not found: ID does not exist" containerID="2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"
Apr 22 17:44:59.341542 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.341497 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8"} err="failed to get container status \"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8\": rpc error: code = NotFound desc = could not find container \"2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8\": container with ID starting with 2561dd1d3a18b4392917e5b367391c7f949b793a40572c26d72ea8c3a7999ae8 not found: ID does not exist"
Apr 22 17:44:59.349464 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.349435 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:44:59.355026 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:44:59.354997 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-9bhk5"]
Apr 22 17:45:01.098818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:45:01.098782 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed1266f-f14f-45bc-9f91-840f80f3de56" path="/var/lib/kubelet/pods/fed1266f-f14f-45bc-9f91-840f80f3de56/volumes"
Apr 22 17:45:09.334232 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:45:09.334200 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-drzkg"
Apr 22 17:47:03.552044 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.552009 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-hltgs"]
Apr 22 17:47:03.552521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.552388 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fed1266f-f14f-45bc-9f91-840f80f3de56" containerName="authorino"
Apr 22 17:47:03.552521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.552400 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed1266f-f14f-45bc-9f91-840f80f3de56" containerName="authorino"
Apr 22 17:47:03.552521 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.552458 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fed1266f-f14f-45bc-9f91-840f80f3de56" containerName="authorino"
Apr 22 17:47:03.555365 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.555347 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hltgs"
Apr 22 17:47:03.562205 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.562180 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 17:47:03.562721 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.562499 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-27gmk\""
Apr 22 17:47:03.563207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.563186 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 17:47:03.563207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.563202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 17:47:03.566672 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.566648 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hltgs"]
Apr 22 17:47:03.614863 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.614837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28wx\" (UniqueName: \"kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx\") pod \"s3-init-hltgs\" (UID: \"ee49a464-d38e-4463-ba52-0ab26dafd3c7\") " pod="kserve/s3-init-hltgs"
Apr 22 17:47:03.715634 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.715597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q28wx\" (UniqueName: \"kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx\") pod \"s3-init-hltgs\" (UID: \"ee49a464-d38e-4463-ba52-0ab26dafd3c7\") " pod="kserve/s3-init-hltgs"
Apr 22 17:47:03.725079 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.725046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28wx\" (UniqueName: \"kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx\") pod \"s3-init-hltgs\" (UID: \"ee49a464-d38e-4463-ba52-0ab26dafd3c7\") " pod="kserve/s3-init-hltgs"
Apr 22 17:47:03.870244 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.870169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hltgs"
Apr 22 17:47:03.991268 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:03.991242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hltgs"]
Apr 22 17:47:03.992960 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:47:03.992922 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee49a464_d38e_4463_ba52_0ab26dafd3c7.slice/crio-30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af WatchSource:0}: Error finding container 30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af: Status 404 returned error can't find the container with id 30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af
Apr 22 17:47:04.811549 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:04.811505 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hltgs" event={"ID":"ee49a464-d38e-4463-ba52-0ab26dafd3c7","Type":"ContainerStarted","Data":"30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af"}
Apr 22 17:47:08.830411 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:08.830320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hltgs" event={"ID":"ee49a464-d38e-4463-ba52-0ab26dafd3c7","Type":"ContainerStarted","Data":"f7e39f9d5f3d8f0fd28c2a560e70ad1682ae9fa3aebba953727a8060e2baa90e"}
Apr 22 17:47:08.848116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:08.848065 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-hltgs" podStartSLOduration=1.401830177 podStartE2EDuration="5.848049038s" podCreationTimestamp="2026-04-22 17:47:03 +0000 UTC" firstStartedPulling="2026-04-22 17:47:03.994868431 +0000 UTC m=+757.474125091" lastFinishedPulling="2026-04-22 17:47:08.441087289 +0000 UTC m=+761.920343952" observedRunningTime="2026-04-22 17:47:08.846466842 +0000 UTC m=+762.325723522" watchObservedRunningTime="2026-04-22 17:47:08.848049038 +0000 UTC m=+762.327305754"
Apr 22 17:47:11.844307 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:11.844220 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee49a464-d38e-4463-ba52-0ab26dafd3c7" containerID="f7e39f9d5f3d8f0fd28c2a560e70ad1682ae9fa3aebba953727a8060e2baa90e" exitCode=0
Apr 22 17:47:11.844307 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:11.844268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hltgs" event={"ID":"ee49a464-d38e-4463-ba52-0ab26dafd3c7","Type":"ContainerDied","Data":"f7e39f9d5f3d8f0fd28c2a560e70ad1682ae9fa3aebba953727a8060e2baa90e"}
Apr 22 17:47:12.978013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:12.977991 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hltgs"
Apr 22 17:47:13.000965 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.000936 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28wx\" (UniqueName: \"kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx\") pod \"ee49a464-d38e-4463-ba52-0ab26dafd3c7\" (UID: \"ee49a464-d38e-4463-ba52-0ab26dafd3c7\") "
Apr 22 17:47:13.003194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.003160 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx" (OuterVolumeSpecName: "kube-api-access-q28wx") pod "ee49a464-d38e-4463-ba52-0ab26dafd3c7" (UID: "ee49a464-d38e-4463-ba52-0ab26dafd3c7"). InnerVolumeSpecName "kube-api-access-q28wx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:47:13.102585 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.102489 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q28wx\" (UniqueName: \"kubernetes.io/projected/ee49a464-d38e-4463-ba52-0ab26dafd3c7-kube-api-access-q28wx\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:47:13.851966 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.851936 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hltgs"
Apr 22 17:47:13.852145 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.851943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hltgs" event={"ID":"ee49a464-d38e-4463-ba52-0ab26dafd3c7","Type":"ContainerDied","Data":"30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af"}
Apr 22 17:47:13.852145 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:13.852040 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30849a267c5988f2f926736199d58b6572f6436cd8b59cfd16fba013b90285af"
Apr 22 17:47:36.896165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.896130 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"]
Apr 22 17:47:36.896673 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.896504 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee49a464-d38e-4463-ba52-0ab26dafd3c7" containerName="s3-init"
Apr 22 17:47:36.896673 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.896515 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee49a464-d38e-4463-ba52-0ab26dafd3c7" containerName="s3-init"
Apr 22 17:47:36.896673 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.896593 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee49a464-d38e-4463-ba52-0ab26dafd3c7" containerName="s3-init"
Apr 22 17:47:36.931403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.931373 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"]
Apr 22 17:47:36.931565 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.931499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:36.933760 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.933732 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\""
Apr 22 17:47:36.933975 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.933944 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 17:47:36.936741 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.934638 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 22 17:47:36.936741 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:36.934991 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 17:47:37.006598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.006598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.006870 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.006870 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006638 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7qt\" (UniqueName: \"kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.006870 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.006870 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.006767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.107902 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.107874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.107902 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.107905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.108121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.107930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"
Apr 22 17:47:37.108121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7qt\" (UniqueName: \"kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") "
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.108121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.108231 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.108340 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.108411 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.108535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.108513 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.110285 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.110264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.110423 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.110406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.117226 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.117205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7qt\" (UniqueName: \"kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt\") pod \"scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.245241 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.245140 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:37.383209 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.383185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"] Apr 22 17:47:37.385011 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:47:37.384982 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c62e03_f5ef_426c_b174_9f0026e47463.slice/crio-492e2646085f183e46342a66b3d99d3fd931350f7f12c6d172b2daf1de66dce3 WatchSource:0}: Error finding container 492e2646085f183e46342a66b3d99d3fd931350f7f12c6d172b2daf1de66dce3: Status 404 returned error can't find the container with id 492e2646085f183e46342a66b3d99d3fd931350f7f12c6d172b2daf1de66dce3 Apr 22 17:47:37.947208 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:37.947173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerStarted","Data":"492e2646085f183e46342a66b3d99d3fd931350f7f12c6d172b2daf1de66dce3"} Apr 22 17:47:41.965850 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:41.965761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerStarted","Data":"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123"} Apr 22 17:47:45.985499 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:45.985461 2572 generic.go:358] "Generic (PLEG): container finished" podID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerID="abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123" exitCode=0 Apr 22 17:47:45.985983 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:45.985536 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerDied","Data":"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123"} Apr 22 17:47:47.995449 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:47.995410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerStarted","Data":"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73"} Apr 22 17:47:48.017271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:48.017219 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" podStartSLOduration=2.261365378 podStartE2EDuration="12.017203321s" podCreationTimestamp="2026-04-22 17:47:36 +0000 UTC" firstStartedPulling="2026-04-22 17:47:37.386931756 +0000 UTC m=+790.866188431" lastFinishedPulling="2026-04-22 17:47:47.14276971 +0000 UTC m=+800.622026374" observedRunningTime="2026-04-22 17:47:48.015485784 +0000 UTC m=+801.494742465" watchObservedRunningTime="2026-04-22 17:47:48.017203321 +0000 UTC m=+801.496460002" Apr 22 17:47:57.246073 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:57.246030 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:57.246073 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:57.246078 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:47:57.258828 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:57.258793 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 
22 17:47:58.045479 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:47:58.045446 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:48:28.804750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:28.804716 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"] Apr 22 17:48:28.805303 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:28.804987 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="main" containerID="cri-o://77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73" gracePeriod=30 Apr 22 17:48:29.047806 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.047784 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:48:29.156435 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.156405 2572 generic.go:358] "Generic (PLEG): container finished" podID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerID="77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73" exitCode=0 Apr 22 17:48:29.156598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.156467 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" Apr 22 17:48:29.156598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.156482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerDied","Data":"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73"} Apr 22 17:48:29.156598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.156521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9" event={"ID":"f2c62e03-f5ef-426c-b174-9f0026e47463","Type":"ContainerDied","Data":"492e2646085f183e46342a66b3d99d3fd931350f7f12c6d172b2daf1de66dce3"} Apr 22 17:48:29.156598 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.156540 2572 scope.go:117] "RemoveContainer" containerID="77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73" Apr 22 17:48:29.164827 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.164804 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.164961 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.164847 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.164961 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.164882 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm7qt\" (UniqueName: 
\"kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.164961 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.164896 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.165320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.165269 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home" (OuterVolumeSpecName: "home") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:29.165506 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.165476 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache" (OuterVolumeSpecName: "model-cache") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:29.165632 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.165526 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.165632 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.165603 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs\") pod \"f2c62e03-f5ef-426c-b174-9f0026e47463\" (UID: \"f2c62e03-f5ef-426c-b174-9f0026e47463\") " Apr 22 17:48:29.166249 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.166194 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.166249 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.166221 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.173826 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.173799 2572 scope.go:117] "RemoveContainer" containerID="abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123" Apr 22 17:48:29.174655 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.174632 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm" (OuterVolumeSpecName: "dshm") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:29.174655 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.174644 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:48:29.174831 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.174666 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt" (OuterVolumeSpecName: "kube-api-access-lm7qt") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "kube-api-access-lm7qt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:48:29.228602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.228554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f2c62e03-f5ef-426c-b174-9f0026e47463" (UID: "f2c62e03-f5ef-426c-b174-9f0026e47463"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:29.243992 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.243966 2572 scope.go:117] "RemoveContainer" containerID="77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73" Apr 22 17:48:29.244333 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:48:29.244307 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73\": container with ID starting with 77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73 not found: ID does not exist" containerID="77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73" Apr 22 17:48:29.244382 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.244343 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73"} err="failed to get container status \"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73\": rpc error: code = NotFound desc = could not find container \"77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73\": container with ID starting with 77c245feab5708a67bacc23a0e3f640403f6afdd17d8c3d08980355d8b253b73 not found: ID does not exist" Apr 22 17:48:29.244382 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.244365 2572 scope.go:117] "RemoveContainer" containerID="abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123" Apr 22 17:48:29.244626 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:48:29.244606 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123\": container with ID starting with abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123 not found: ID does not exist" 
containerID="abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123" Apr 22 17:48:29.244663 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.244630 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123"} err="failed to get container status \"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123\": rpc error: code = NotFound desc = could not find container \"abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123\": container with ID starting with abaec842aa3909e777631ffec7301b6f161312452c8fe54a13afdac043d0d123 not found: ID does not exist" Apr 22 17:48:29.266797 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.266771 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lm7qt\" (UniqueName: \"kubernetes.io/projected/f2c62e03-f5ef-426c-b174-9f0026e47463-kube-api-access-lm7qt\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.266797 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.266797 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.266934 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.266809 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c62e03-f5ef-426c-b174-9f0026e47463-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.266934 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.266819 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c62e03-f5ef-426c-b174-9f0026e47463-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:48:29.485268 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:48:29.485236 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"] Apr 22 17:48:29.490273 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:29.490249 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5bb988c864-kcmk9"] Apr 22 17:48:31.100411 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:31.100377 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" path="/var/lib/kubelet/pods/f2c62e03-f5ef-426c-b174-9f0026e47463/volumes" Apr 22 17:48:35.804142 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804111 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"] Apr 22 17:48:35.804544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804528 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="storage-initializer" Apr 22 17:48:35.804593 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804545 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="storage-initializer" Apr 22 17:48:35.804593 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804558 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="main" Apr 22 17:48:35.804593 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804564 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="main" Apr 22 17:48:35.804685 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.804627 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c62e03-f5ef-426c-b174-9f0026e47463" containerName="main" Apr 22 17:48:35.809635 ip-10-0-135-36 kubenswrapper[2572]: I0422 
17:48:35.809614 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.811858 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.811838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:48:35.812710 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.812680 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:48:35.812803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.812743 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:48:35.812803 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.812742 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 17:48:35.821234 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.821201 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"] Apr 22 17:48:35.923061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.923254 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.923254 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqp7\" (UniqueName: \"kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.923254 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.923254 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" Apr 22 17:48:35.923254 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:35.923235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home\") pod 
\"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.023960 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.023922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njqp7\" (UniqueName: \"kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024156 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024443 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024492 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.024602 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.024578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.026403 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.026377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.026662 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.026642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.040546 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.040524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqp7\" (UniqueName: \"kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7\") pod \"scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.122449 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.122358 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:36.273079 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:36.273046 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"]
Apr 22 17:48:36.274040 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:48:36.274009 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc021e242_9292_4645_9bb7_40807db10b7f.slice/crio-8e17328c5e29e794d529c603b1c9310714b3a8954e8ac576242ef73c494414f8 WatchSource:0}: Error finding container 8e17328c5e29e794d529c603b1c9310714b3a8954e8ac576242ef73c494414f8: Status 404 returned error can't find the container with id 8e17328c5e29e794d529c603b1c9310714b3a8954e8ac576242ef73c494414f8
Apr 22 17:48:37.194615 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:37.194572 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerStarted","Data":"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"}
Apr 22 17:48:37.195001 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:37.194622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerStarted","Data":"8e17328c5e29e794d529c603b1c9310714b3a8954e8ac576242ef73c494414f8"}
Apr 22 17:48:41.211371 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:41.211336 2572 generic.go:358] "Generic (PLEG): container finished" podID="c021e242-9292-4645-9bb7-40807db10b7f" containerID="1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c" exitCode=0
Apr 22 17:48:41.211789 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:41.211411 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerDied","Data":"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"}
Apr 22 17:48:42.217159 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:42.217126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerStarted","Data":"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"}
Apr 22 17:48:42.236121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:42.236073 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" podStartSLOduration=7.236058755 podStartE2EDuration="7.236058755s" podCreationTimestamp="2026-04-22 17:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:48:42.234248955 +0000 UTC m=+855.713505636" watchObservedRunningTime="2026-04-22 17:48:42.236058755 +0000 UTC m=+855.715315437"
Apr 22 17:48:46.122972 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:46.122933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:46.122972 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:46.122973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:46.135535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:46.135507 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:48:46.244438 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:48:46.244410 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:49:22.104662 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.104578 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"]
Apr 22 17:49:22.108843 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.108817 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.111837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.111817 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 17:49:22.125381 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.125355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"]
Apr 22 17:49:22.232882 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.232850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.233039 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.232905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgkm\" (UniqueName: \"kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.233039 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.232968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.233039 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.233012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.233144 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.233062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.233183 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.233149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.333987 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.333943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgkm\" (UniqueName: \"kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.333987 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.333987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334517 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334593 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.334647 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.334595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.336405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.336387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.336600 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.336582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.350361 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.350330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgkm\" (UniqueName: \"kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.437863 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.437825 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:22.785366 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:22.785340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"]
Apr 22 17:49:22.786833 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:49:22.786807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e7dfc2f_4352_40f9_9cd0_4e376997393b.slice/crio-71037168fb9bf79dcd3e3744ab0debfa88bddf82916b78942fecf86f9b4c04f6 WatchSource:0}: Error finding container 71037168fb9bf79dcd3e3744ab0debfa88bddf82916b78942fecf86f9b4c04f6: Status 404 returned error can't find the container with id 71037168fb9bf79dcd3e3744ab0debfa88bddf82916b78942fecf86f9b4c04f6
Apr 22 17:49:23.383879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:23.383839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerStarted","Data":"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d"}
Apr 22 17:49:23.383879 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:23.383882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerStarted","Data":"71037168fb9bf79dcd3e3744ab0debfa88bddf82916b78942fecf86f9b4c04f6"}
Apr 22 17:49:23.808418 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:23.808378 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"]
Apr 22 17:49:23.809068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:23.809006 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="main" containerID="cri-o://7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb" gracePeriod=30
Apr 22 17:49:24.066177 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.066105 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:49:24.151907 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.151871 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.152120 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.151924 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.152120 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.151989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.152120 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.152043 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.152120 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.152069 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.152120 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.152099 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njqp7\" (UniqueName: \"kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7\") pod \"c021e242-9292-4645-9bb7-40807db10b7f\" (UID: \"c021e242-9292-4645-9bb7-40807db10b7f\") "
Apr 22 17:49:24.154420 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.152519 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home" (OuterVolumeSpecName: "home") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.154420 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.152869 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache" (OuterVolumeSpecName: "model-cache") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.155570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.155428 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm" (OuterVolumeSpecName: "dshm") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.155570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.155542 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7" (OuterVolumeSpecName: "kube-api-access-njqp7") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "kube-api-access-njqp7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:49:24.155886 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.155789 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:49:24.212227 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.212173 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c021e242-9292-4645-9bb7-40807db10b7f" (UID: "c021e242-9292-4645-9bb7-40807db10b7f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.253530 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253484 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.253530 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253535 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.253825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253557 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.253825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253575 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-njqp7\" (UniqueName: \"kubernetes.io/projected/c021e242-9292-4645-9bb7-40807db10b7f-kube-api-access-njqp7\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.253825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253591 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c021e242-9292-4645-9bb7-40807db10b7f-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.253825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.253608 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e242-9292-4645-9bb7-40807db10b7f-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.389076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.388978 2572 generic.go:358] "Generic (PLEG): container finished" podID="c021e242-9292-4645-9bb7-40807db10b7f" containerID="7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb" exitCode=0
Apr 22 17:49:24.389076 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.389044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerDied","Data":"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"}
Apr 22 17:49:24.389576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.389084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7" event={"ID":"c021e242-9292-4645-9bb7-40807db10b7f","Type":"ContainerDied","Data":"8e17328c5e29e794d529c603b1c9310714b3a8954e8ac576242ef73c494414f8"}
Apr 22 17:49:24.389576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.389090 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"
Apr 22 17:49:24.389576 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.389099 2572 scope.go:117] "RemoveContainer" containerID="7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"
Apr 22 17:49:24.399417 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.399398 2572 scope.go:117] "RemoveContainer" containerID="1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"
Apr 22 17:49:24.412613 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.412592 2572 scope.go:117] "RemoveContainer" containerID="7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"
Apr 22 17:49:24.412922 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:49:24.412901 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb\": container with ID starting with 7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb not found: ID does not exist" containerID="7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"
Apr 22 17:49:24.413015 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.412928 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb"} err="failed to get container status \"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb\": rpc error: code = NotFound desc = could not find container \"7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb\": container with ID starting with 7293ceaf34543c98aaace24f5424cdfee25f3040043cb8a7363486fc84b29acb not found: ID does not exist"
Apr 22 17:49:24.413015 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.412945 2572 scope.go:117] "RemoveContainer" containerID="1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"
Apr 22 17:49:24.413211 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:49:24.413192 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c\": container with ID starting with 1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c not found: ID does not exist" containerID="1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"
Apr 22 17:49:24.413250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.413218 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c"} err="failed to get container status \"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c\": rpc error: code = NotFound desc = could not find container \"1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c\": container with ID starting with 1d8f6469e2e1383e498697fc98c2c0d8b9270e99fe70df8ed147a8b60e6fb50c not found: ID does not exist"
Apr 22 17:49:24.417431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.417413 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"]
Apr 22 17:49:24.423389 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:24.423364 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5bd4d97c96-xvms7"]
Apr 22 17:49:25.100161 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:25.100125 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c021e242-9292-4645-9bb7-40807db10b7f" path="/var/lib/kubelet/pods/c021e242-9292-4645-9bb7-40807db10b7f/volumes"
Apr 22 17:49:27.057118 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:27.057090 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:49:27.058723 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:27.058684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:49:27.410871 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:27.410836 2572 generic.go:358] "Generic (PLEG): container finished" podID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerID="20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d" exitCode=0
Apr 22 17:49:27.411048 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:27.410908 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerDied","Data":"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d"}
Apr 22 17:49:28.419272 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:28.419235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerStarted","Data":"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88"}
Apr 22 17:49:28.439967 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:28.439910 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" podStartSLOduration=6.439895655 podStartE2EDuration="6.439895655s" podCreationTimestamp="2026-04-22 17:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:49:28.438816985 +0000 UTC m=+901.918073696" watchObservedRunningTime="2026-04-22 17:49:28.439895655 +0000 UTC m=+901.919152336"
Apr 22 17:49:32.438482 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:32.438443 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:32.438482 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:32.438480 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:32.450622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:32.450597 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:33.453312 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:33.453278 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:55.705417 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:55.705380 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"]
Apr 22 17:49:55.707895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:55.705738 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="main" containerID="cri-o://d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88" gracePeriod=30
Apr 22 17:49:55.960683 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:55.960608 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"
Apr 22 17:49:56.040724 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040673 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpgkm\" (UniqueName: \"kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.040724 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.040939 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040784 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.040939 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040823 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.040939 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040840 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.040939 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.040858 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs\") pod \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\" (UID: \"5e7dfc2f-4352-40f9-9cd0-4e376997393b\") "
Apr 22 17:49:56.041155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.041007 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home" (OuterVolumeSpecName: "home") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:56.041215 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.041157 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache" (OuterVolumeSpecName: "model-cache") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:56.043034 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.042995 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm" (OuterVolumeSpecName: "dshm") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:56.043034 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.043017 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b").
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:49:56.043230 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.043060 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm" (OuterVolumeSpecName: "kube-api-access-lpgkm") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b"). InnerVolumeSpecName "kube-api-access-lpgkm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:49:56.095519 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.095470 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e7dfc2f-4352-40f9-9cd0-4e376997393b" (UID: "5e7dfc2f-4352-40f9-9cd0-4e376997393b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:56.141778 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141745 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7dfc2f-4352-40f9-9cd0-4e376997393b-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.141778 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141774 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lpgkm\" (UniqueName: \"kubernetes.io/projected/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kube-api-access-lpgkm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.141778 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141784 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.142004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141793 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.142004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141802 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.142004 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.141810 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7dfc2f-4352-40f9-9cd0-4e376997393b-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.534709 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.534659 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerID="d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88" exitCode=0 Apr 22 17:49:56.534916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.534750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerDied","Data":"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88"} Apr 22 17:49:56.534916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.534784 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" Apr 22 17:49:56.534916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.534799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj" event={"ID":"5e7dfc2f-4352-40f9-9cd0-4e376997393b","Type":"ContainerDied","Data":"71037168fb9bf79dcd3e3744ab0debfa88bddf82916b78942fecf86f9b4c04f6"} Apr 22 17:49:56.534916 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.534818 2572 scope.go:117] "RemoveContainer" containerID="d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88" Apr 22 17:49:56.544925 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.544907 2572 scope.go:117] "RemoveContainer" containerID="20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d" Apr 22 17:49:56.559627 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.559609 2572 scope.go:117] "RemoveContainer" containerID="d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88" Apr 22 17:49:56.559961 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:49:56.559933 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88\": 
container with ID starting with d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88 not found: ID does not exist" containerID="d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88" Apr 22 17:49:56.560055 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.559967 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88"} err="failed to get container status \"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88\": rpc error: code = NotFound desc = could not find container \"d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88\": container with ID starting with d709dc15685a95553c3ca660b0570e7ec7e736cf130d527af8d0b2cb12bd2f88 not found: ID does not exist" Apr 22 17:49:56.560055 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.559988 2572 scope.go:117] "RemoveContainer" containerID="20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d" Apr 22 17:49:56.560239 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:49:56.560220 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d\": container with ID starting with 20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d not found: ID does not exist" containerID="20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d" Apr 22 17:49:56.560311 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.560245 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d"} err="failed to get container status \"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d\": rpc error: code = NotFound desc = could not find container \"20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d\": container with ID 
starting with 20365228f6384f27c496c85db2a2a9a4449ee8faedd72cb72846d4b5c5d0fd6d not found: ID does not exist" Apr 22 17:49:56.565330 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.565304 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"] Apr 22 17:49:56.568358 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:56.568338 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-2lvcj"] Apr 22 17:49:57.099319 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:49:57.099287 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" path="/var/lib/kubelet/pods/5e7dfc2f-4352-40f9-9cd0-4e376997393b/volumes" Apr 22 17:50:07.305443 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.305406 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:50:07.306078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306060 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="storage-initializer" Apr 22 17:50:07.306128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306082 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="storage-initializer" Apr 22 17:50:07.306128 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306124 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="main" Apr 22 17:50:07.306200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306132 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="main" Apr 22 17:50:07.306200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306143 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="storage-initializer" Apr 22 17:50:07.306200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306152 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="storage-initializer" Apr 22 17:50:07.306200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306170 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="main" Apr 22 17:50:07.306200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306177 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="main" Apr 22 17:50:07.306349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306261 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e7dfc2f-4352-40f9-9cd0-4e376997393b" containerName="main" Apr 22 17:50:07.306349 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.306275 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c021e242-9292-4645-9bb7-40807db10b7f" containerName="main" Apr 22 17:50:07.312008 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.311979 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.314486 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.314463 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:50:07.314621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.314498 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:50:07.314917 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.314896 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:50:07.315263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.315249 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-v1a2-to-v1a1-kserve-self-signed-certs\"" Apr 22 17:50:07.324651 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.324623 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:50:07.438984 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.438947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.438984 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.438990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: 
\"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.439217 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.439063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.439217 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.439133 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6jbk\" (UniqueName: \"kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.439217 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.439162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.439217 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.439189 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 
22 17:50:07.539836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.539798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6jbk\" (UniqueName: \"kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.539836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.539841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.539870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.539982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540117 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.540016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540299 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.540135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540451 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.540428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540492 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.540465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.540539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.540491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " 
pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.542142 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.542124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.542544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.542522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.550558 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.550538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6jbk\" (UniqueName: \"kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk\") pod \"conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.625072 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.624975 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:50:07.767250 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.767210 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:50:07.770111 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:50:07.770080 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6beecfa_6255_4f25_bc3a_3e420e2256a1.slice/crio-146075b9a47e107d4343a35139c3db875de0c717b77898e93f9737e813210745 WatchSource:0}: Error finding container 146075b9a47e107d4343a35139c3db875de0c717b77898e93f9737e813210745: Status 404 returned error can't find the container with id 146075b9a47e107d4343a35139c3db875de0c717b77898e93f9737e813210745 Apr 22 17:50:07.771955 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:07.771935 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:50:08.587691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:08.587648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerStarted","Data":"ca78098685e195d67f29ea6bd2884120be4d0755d603b816f57065e1889f81f8"} Apr 22 17:50:08.587691 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:08.587690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerStarted","Data":"146075b9a47e107d4343a35139c3db875de0c717b77898e93f9737e813210745"} Apr 22 17:50:12.105581 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.105541 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:50:12.109342 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:50:12.109275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.112144 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.112117 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 17:50:12.123594 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.123563 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:50:12.283273 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.283455 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.283455 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 
17:50:12.283557 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrpn\" (UniqueName: \"kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.283557 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.283557 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.283537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.384748 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.384748 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.384748 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385022 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrpn\" (UniqueName: \"kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385022 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385022 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.384853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.385116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.385161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.385299 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.385279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.387208 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.387187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.387474 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.387456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.415198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.415162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrpn\" (UniqueName: \"kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn\") pod \"stop-feature-test-kserve-869cf6fb9f-sdpgx\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.422067 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.422038 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:50:12.556272 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.556242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:50:12.557836 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:50:12.557804 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb680cd_4ed2_4380_940b_7557190d098f.slice/crio-6161b138cb7411ab614dd2ab4a6833a3b65e7700de2500d638f37ff191eadb08 WatchSource:0}: Error finding container 6161b138cb7411ab614dd2ab4a6833a3b65e7700de2500d638f37ff191eadb08: Status 404 returned error can't find the container with id 6161b138cb7411ab614dd2ab4a6833a3b65e7700de2500d638f37ff191eadb08 Apr 22 17:50:12.605552 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.605520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" 
event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerStarted","Data":"6161b138cb7411ab614dd2ab4a6833a3b65e7700de2500d638f37ff191eadb08"} Apr 22 17:50:12.606899 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.606871 2572 generic.go:358] "Generic (PLEG): container finished" podID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerID="ca78098685e195d67f29ea6bd2884120be4d0755d603b816f57065e1889f81f8" exitCode=0 Apr 22 17:50:12.607032 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:12.606937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerDied","Data":"ca78098685e195d67f29ea6bd2884120be4d0755d603b816f57065e1889f81f8"} Apr 22 17:50:13.381552 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:13.381489 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:50:13.613841 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:13.613796 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerStarted","Data":"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60"} Apr 22 17:50:17.633805 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:17.633719 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfb680cd-4ed2-4380-940b-7557190d098f" containerID="e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60" exitCode=0 Apr 22 17:50:17.633805 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:50:17.633779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerDied","Data":"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60"} Apr 22 17:51:01.853439 
ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:01.853401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerStarted","Data":"c570224f262153fe53a246b8b21488488a015bf888ed84985bc3a4861176600c"} Apr 22 17:51:01.853866 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:01.853534 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerName="main" containerID="cri-o://c570224f262153fe53a246b8b21488488a015bf888ed84985bc3a4861176600c" gracePeriod=30 Apr 22 17:51:01.874562 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:01.874501 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" podStartSLOduration=5.800647966 podStartE2EDuration="54.874483049s" podCreationTimestamp="2026-04-22 17:50:07 +0000 UTC" firstStartedPulling="2026-04-22 17:50:12.608022022 +0000 UTC m=+946.087278688" lastFinishedPulling="2026-04-22 17:51:01.681857106 +0000 UTC m=+995.161113771" observedRunningTime="2026-04-22 17:51:01.871768018 +0000 UTC m=+995.351024703" watchObservedRunningTime="2026-04-22 17:51:01.874483049 +0000 UTC m=+995.353739732" Apr 22 17:51:02.860239 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:02.860201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerStarted","Data":"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0"} Apr 22 17:51:02.889636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:02.889564 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" 
podStartSLOduration=6.759775488 podStartE2EDuration="50.889544325s" podCreationTimestamp="2026-04-22 17:50:12 +0000 UTC" firstStartedPulling="2026-04-22 17:50:17.635146066 +0000 UTC m=+951.114402731" lastFinishedPulling="2026-04-22 17:51:01.764914895 +0000 UTC m=+995.244171568" observedRunningTime="2026-04-22 17:51:02.885389404 +0000 UTC m=+996.364646086" watchObservedRunningTime="2026-04-22 17:51:02.889544325 +0000 UTC m=+996.368801008" Apr 22 17:51:07.625532 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:07.625483 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:51:12.423188 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:12.423150 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:51:12.423621 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:12.423334 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:51:12.425245 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:12.425209 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:51:22.422786 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:22.422734 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:51:31.998072 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:31.998046 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr_d6beecfa-6255-4f25-bc3a-3e420e2256a1/main/0.log" Apr 22 17:51:31.998484 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:31.998374 2572 generic.go:358] "Generic (PLEG): container finished" podID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerID="c570224f262153fe53a246b8b21488488a015bf888ed84985bc3a4861176600c" exitCode=137 Apr 22 17:51:31.998484 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:31.998455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerDied","Data":"c570224f262153fe53a246b8b21488488a015bf888ed84985bc3a4861176600c"} Apr 22 17:51:32.073985 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.073958 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr_d6beecfa-6255-4f25-bc3a-3e420e2256a1/main/0.log" Apr 22 17:51:32.074319 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.074301 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:51:32.156343 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156301 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156383 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6jbk\" (UniqueName: \"kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156405 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156477 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156560 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:51:32.156549 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home\") pod \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\" (UID: \"d6beecfa-6255-4f25-bc3a-3e420e2256a1\") " Apr 22 17:51:32.156890 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156759 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home" (OuterVolumeSpecName: "home") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:51:32.156890 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.156767 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache" (OuterVolumeSpecName: "model-cache") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:51:32.158718 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.158679 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm" (OuterVolumeSpecName: "dshm") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:51:32.159132 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.159109 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:51:32.159236 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.159126 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk" (OuterVolumeSpecName: "kube-api-access-h6jbk") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). InnerVolumeSpecName "kube-api-access-h6jbk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:51:32.211354 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.211289 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d6beecfa-6255-4f25-bc3a-3e420e2256a1" (UID: "d6beecfa-6255-4f25-bc3a-3e420e2256a1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:51:32.258024 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.257978 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6jbk\" (UniqueName: \"kubernetes.io/projected/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kube-api-access-h6jbk\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.258024 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.258020 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.258024 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.258036 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.258318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.258050 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6beecfa-6255-4f25-bc3a-3e420e2256a1-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.258318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.258067 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.258318 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.258078 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6beecfa-6255-4f25-bc3a-3e420e2256a1-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:51:32.423391 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:32.423278 2572 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:51:33.004859 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.004834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr_d6beecfa-6255-4f25-bc3a-3e420e2256a1/main/0.log" Apr 22 17:51:33.005311 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.005272 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" Apr 22 17:51:33.005311 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.005271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr" event={"ID":"d6beecfa-6255-4f25-bc3a-3e420e2256a1","Type":"ContainerDied","Data":"146075b9a47e107d4343a35139c3db875de0c717b77898e93f9737e813210745"} Apr 22 17:51:33.005396 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.005327 2572 scope.go:117] "RemoveContainer" containerID="c570224f262153fe53a246b8b21488488a015bf888ed84985bc3a4861176600c" Apr 22 17:51:33.015295 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.015276 2572 scope.go:117] "RemoveContainer" containerID="ca78098685e195d67f29ea6bd2884120be4d0755d603b816f57065e1889f81f8" Apr 22 17:51:33.028832 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.028688 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:51:33.032547 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:33.032523 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-v1a2-to-v1a1-kserve-9c4b9c55b-j5gtr"] Apr 22 17:51:33.100231 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:51:33.100201 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" path="/var/lib/kubelet/pods/d6beecfa-6255-4f25-bc3a-3e420e2256a1/volumes" Apr 22 17:51:42.422956 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:42.422913 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:51:52.422922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:51:52.422871 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:52:02.422997 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:02.422943 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:52:12.423200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:12.423149 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:52:22.422684 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:22.422580 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:52:32.423093 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:32.423041 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 22 17:52:42.432194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:42.432155 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:52:42.439836 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:42.439809 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:52:56.203980 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.203947 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"] Apr 22 17:52:56.204434 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.204318 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerName="storage-initializer" Apr 22 17:52:56.204434 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.204330 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerName="storage-initializer" Apr 22 17:52:56.204434 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.204366 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" 
containerName="main" Apr 22 17:52:56.204434 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.204372 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerName="main" Apr 22 17:52:56.204569 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.204446 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6beecfa-6255-4f25-bc3a-3e420e2256a1" containerName="main" Apr 22 17:52:56.207599 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.207584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.209880 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.209864 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 17:52:56.218116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.218095 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"] Apr 22 17:52:56.305742 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrpj\" (UniqueName: \"kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.305888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.305888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.305888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.305997 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305881 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.305997 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.305932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs\") 
pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.406885 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.406856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrpj\" (UniqueName: \"kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.406900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407078 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.406923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407207 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407369 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407605 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.407756 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.407609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.409433 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.409404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.409772 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.409754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.424000 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.423979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrpj\" (UniqueName: \"kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.518989 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.518903 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:52:56.647966 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:56.647933 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"] Apr 22 17:52:56.650995 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:52:56.650966 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234f3d9e_ee85_48fb_8c8b_4296725b4b43.slice/crio-06ed5d099adf0db000f6203e43cc175f8b8f274fc47d26e1ca6d1ceb8b451e0a WatchSource:0}: Error finding container 06ed5d099adf0db000f6203e43cc175f8b8f274fc47d26e1ca6d1ceb8b451e0a: Status 404 returned error can't find the container with id 06ed5d099adf0db000f6203e43cc175f8b8f274fc47d26e1ca6d1ceb8b451e0a Apr 22 17:52:57.336518 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:57.336486 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerStarted","Data":"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"} Apr 22 17:52:57.336518 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:52:57.336521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerStarted","Data":"06ed5d099adf0db000f6203e43cc175f8b8f274fc47d26e1ca6d1ceb8b451e0a"} Apr 22 17:53:01.356845 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:01.356763 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerID="b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb" exitCode=0 Apr 22 17:53:01.357267 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:01.356841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerDied","Data":"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"} Apr 22 17:53:02.362466 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:02.362430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerStarted","Data":"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"} Apr 22 17:53:02.382261 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:02.382212 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podStartSLOduration=6.382199442 podStartE2EDuration="6.382199442s" podCreationTimestamp="2026-04-22 17:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:02.381133811 +0000 UTC m=+1115.860390494" watchObservedRunningTime="2026-04-22 17:53:02.382199442 +0000 UTC m=+1115.861456125" Apr 22 17:53:04.581263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:04.581231 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:53:04.581625 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:04.581509 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" 
podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main" containerID="cri-o://e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0" gracePeriod=30 Apr 22 17:53:06.519801 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:06.519762 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:53:06.519801 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:06.519808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" Apr 22 17:53:06.521464 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:06.521429 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 22 17:53:13.298094 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.298058 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"] Apr 22 17:53:13.333352 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.333323 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"] Apr 22 17:53:13.333498 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.333438 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.460946 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.460910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9sgz\" (UniqueName: \"kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.460946 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.460948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.461165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.461038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.461165 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.461127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.461165 ip-10-0-135-36 kubenswrapper[2572]: 
I0422 17:53:13.461160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.461284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.461179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562572 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562572 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9sgz\" (UniqueName: \"kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz\") pod 
\"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.562818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.562688 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.563047 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.563016 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.563114 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.563052 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.563242 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.563220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.564899 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.564878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.565060 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.565043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.571046 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.571019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9sgz\" (UniqueName: \"kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz\") pod \"stop-feature-test-kserve-869cf6fb9f-p6pjs\" (UID: 
\"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.644442 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.644412 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:13.781437 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:13.781413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"] Apr 22 17:53:13.782847 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:53:13.782821 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29e1b1ee_f679_47a5_b568_dea36a0e8530.slice/crio-a5b0a0f8e38b8751d2d8e8470b96cff42115ea2deaf46c4a25b8ce7233942ea0 WatchSource:0}: Error finding container a5b0a0f8e38b8751d2d8e8470b96cff42115ea2deaf46c4a25b8ce7233942ea0: Status 404 returned error can't find the container with id a5b0a0f8e38b8751d2d8e8470b96cff42115ea2deaf46c4a25b8ce7233942ea0 Apr 22 17:53:14.416977 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:14.416933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerStarted","Data":"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d"} Apr 22 17:53:14.416977 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:14.416981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerStarted","Data":"a5b0a0f8e38b8751d2d8e8470b96cff42115ea2deaf46c4a25b8ce7233942ea0"} Apr 22 17:53:16.519743 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:16.519679 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 22 17:53:18.438013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:18.437979 2572 generic.go:358] "Generic (PLEG): container finished" podID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerID="92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d" exitCode=0 Apr 22 17:53:18.438386 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:18.438049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerDied","Data":"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d"} Apr 22 17:53:19.444484 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:19.444444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerStarted","Data":"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e"} Apr 22 17:53:19.467464 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:19.467410 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podStartSLOduration=6.467396164 podStartE2EDuration="6.467396164s" podCreationTimestamp="2026-04-22 17:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:19.464573996 +0000 UTC m=+1132.943830678" watchObservedRunningTime="2026-04-22 17:53:19.467396164 +0000 UTC m=+1132.946652845" Apr 22 17:53:23.645202 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:23.645155 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:23.645630 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:23.645221 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" Apr 22 17:53:23.646953 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:23.646924 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 17:53:26.520371 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:26.520325 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 22 17:53:33.645833 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:33.645781 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 17:53:34.835065 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.835003 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-869cf6fb9f-sdpgx_bfb680cd-4ed2-4380-940b-7557190d098f/main/0.log" Apr 22 17:53:34.835411 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.835402 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:53:34.876474 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876432 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgrpn\" (UniqueName: \"kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.876669 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876517 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.876669 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876589 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.876669 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876612 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.876669 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876644 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.876956 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.876684 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location\") pod \"bfb680cd-4ed2-4380-940b-7557190d098f\" (UID: \"bfb680cd-4ed2-4380-940b-7557190d098f\") " Apr 22 17:53:34.877457 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.877401 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache" (OuterVolumeSpecName: "model-cache") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:53:34.877591 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.877485 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home" (OuterVolumeSpecName: "home") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:53:34.880013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.879975 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm" (OuterVolumeSpecName: "dshm") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:53:34.880116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.880033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:53:34.882035 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.882011 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn" (OuterVolumeSpecName: "kube-api-access-dgrpn") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). InnerVolumeSpecName "kube-api-access-dgrpn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:53:34.944414 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.944371 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bfb680cd-4ed2-4380-940b-7557190d098f" (UID: "bfb680cd-4ed2-4380-940b-7557190d098f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:53:34.978002 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.977960 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:34.978002 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.978004 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfb680cd-4ed2-4380-940b-7557190d098f-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:34.978194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.978019 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:34.978194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.978034 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:34.978194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.978051 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dgrpn\" (UniqueName: \"kubernetes.io/projected/bfb680cd-4ed2-4380-940b-7557190d098f-kube-api-access-dgrpn\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:34.978194 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:34.978065 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfb680cd-4ed2-4380-940b-7557190d098f-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:53:35.519111 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519074 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-869cf6fb9f-sdpgx_bfb680cd-4ed2-4380-940b-7557190d098f/main/0.log" Apr 22 17:53:35.519459 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519431 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfb680cd-4ed2-4380-940b-7557190d098f" containerID="e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0" exitCode=137 Apr 22 17:53:35.519533 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerDied","Data":"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0"} Apr 22 17:53:35.519575 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" event={"ID":"bfb680cd-4ed2-4380-940b-7557190d098f","Type":"ContainerDied","Data":"6161b138cb7411ab614dd2ab4a6833a3b65e7700de2500d638f37ff191eadb08"} Apr 22 17:53:35.519575 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519548 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx" Apr 22 17:53:35.519636 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.519559 2572 scope.go:117] "RemoveContainer" containerID="e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0" Apr 22 17:53:35.538804 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.538775 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:53:35.540985 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.540960 2572 scope.go:117] "RemoveContainer" containerID="e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60" Apr 22 17:53:35.542584 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.542559 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-sdpgx"] Apr 22 17:53:35.622091 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.621932 2572 scope.go:117] "RemoveContainer" containerID="e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0" Apr 22 17:53:35.622327 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:53:35.622306 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0\": container with ID starting with e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0 not found: ID does not exist" containerID="e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0" Apr 22 17:53:35.622404 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.622341 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0"} err="failed to get container status \"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0\": rpc error: code = NotFound desc = could not find container 
\"e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0\": container with ID starting with e7ad2a8362e93e0161f32bf02b92d6616e3baf10eb5b692cbcf6b1d6978904f0 not found: ID does not exist" Apr 22 17:53:35.622404 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.622371 2572 scope.go:117] "RemoveContainer" containerID="e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60" Apr 22 17:53:35.622690 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:53:35.622659 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60\": container with ID starting with e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60 not found: ID does not exist" containerID="e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60" Apr 22 17:53:35.622783 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:35.622715 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60"} err="failed to get container status \"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60\": rpc error: code = NotFound desc = could not find container \"e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60\": container with ID starting with e7303a13aecf7e959aa51e429f3a1e9a54b819e33d63e6afade732381f296f60 not found: ID does not exist" Apr 22 17:53:36.519447 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:36.519406 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 22 17:53:37.100015 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:37.099970 2572 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" path="/var/lib/kubelet/pods/bfb680cd-4ed2-4380-940b-7557190d098f/volumes"
Apr 22 17:53:43.644826 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:43.644774 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:53:46.519818 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:46.519720 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:53:53.645395 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:53.645257 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:53:56.520405 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:53:56.520353 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:03.645806 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:03.645752 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:06.520430 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:06.520382 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:13.644854 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:13.644804 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:16.520075 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:16.520028 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:23.645561 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:23.645515 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:26.519928 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:26.519883 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:27.098082 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:27.096678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:54:27.107622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:27.107598 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log"
Apr 22 17:54:33.645334 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:33.645286 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:36.519334 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:36.519289 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:43.644683 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:43.644645 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:46.519761 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:46.519713 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 22 17:54:53.645388 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:53.645338 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:54:56.529825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:56.529791 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"
Apr 22 17:54:56.537709 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:54:56.537666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"
Apr 22 17:55:03.645677 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:03.645626 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:55:05.506673 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:05.506638 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"]
Apr 22 17:55:05.507143 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:05.506986 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" containerID="cri-o://1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b" gracePeriod=30
Apr 22 17:55:13.644918 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:13.644873 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 17:55:18.492061 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.491987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"]
Apr 22 17:55:18.492422 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.492376 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main"
Apr 22 17:55:18.492422 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.492387 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main"
Apr 22 17:55:18.492422 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.492403 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="storage-initializer"
Apr 22 17:55:18.492422 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.492408 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="storage-initializer"
Apr 22 17:55:18.492570 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.492478 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfb680cd-4ed2-4380-940b-7557190d098f" containerName="main"
Apr 22 17:55:18.497084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.497058 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.499389 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.499368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 22 17:55:18.508560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.508534 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"]
Apr 22 17:55:18.644429 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.644618 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.644618 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.644618 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.644776 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69nn\" (UniqueName: \"kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.644776 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.644652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745524 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745524 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745774 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745774 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745774 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q69nn\" (UniqueName: \"kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.745996 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.745965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.746198 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.746172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.746315 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.746208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.746453 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.746414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.748687 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.748666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.748808 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.748779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.761947 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.761911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69nn\" (UniqueName: \"kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn\") pod \"custom-route-timeout-test-kserve-66d46d49f5-gcwg4\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.809447 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.809411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:18.989601 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.989565 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"]
Apr 22 17:55:18.990926 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:55:18.990896 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca75965a_862d_492d_b120_f7f935c93947.slice/crio-a406543275b1dd722ac9088b27cc3eda6dde84b82cdc792e47c79fa91c00a656 WatchSource:0}: Error finding container a406543275b1dd722ac9088b27cc3eda6dde84b82cdc792e47c79fa91c00a656: Status 404 returned error can't find the container with id a406543275b1dd722ac9088b27cc3eda6dde84b82cdc792e47c79fa91c00a656
Apr 22 17:55:18.992996 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:18.992978 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:55:19.970006 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:19.969964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerStarted","Data":"ac1b89a508f30711364e5a9d4b748525fcd54a837bc3455349e41502f6e03cbb"}
Apr 22 17:55:19.970394 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:19.970014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerStarted","Data":"a406543275b1dd722ac9088b27cc3eda6dde84b82cdc792e47c79fa91c00a656"}
Apr 22 17:55:23.654512 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:23.654477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"
Apr 22 17:55:23.662279 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:23.662245 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"
Apr 22 17:55:23.997560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:23.997471 2572 generic.go:358] "Generic (PLEG): container finished" podID="ca75965a-862d-492d-b120-f7f935c93947" containerID="ac1b89a508f30711364e5a9d4b748525fcd54a837bc3455349e41502f6e03cbb" exitCode=0
Apr 22 17:55:23.997560 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:23.997544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerDied","Data":"ac1b89a508f30711364e5a9d4b748525fcd54a837bc3455349e41502f6e03cbb"}
Apr 22 17:55:25.003361 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:25.003318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerStarted","Data":"4add97eb8dcf5aa4ba6a025482cb7c336f41715620576256296b4c5b47785627"}
Apr 22 17:55:25.024895 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:25.024838 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podStartSLOduration=7.024820074 podStartE2EDuration="7.024820074s" podCreationTimestamp="2026-04-22 17:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:25.021752742 +0000 UTC m=+1258.501009423" watchObservedRunningTime="2026-04-22 17:55:25.024820074 +0000 UTC m=+1258.504076757"
Apr 22 17:55:28.810273 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:28.810233 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:28.810752 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:28.810286 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"
Apr 22 17:55:28.811959 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:28.811925 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 22 17:55:35.758948 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.758918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq_234f3d9e-ee85-48fb-8c8b-4296725b4b43/main/0.log"
Apr 22 17:55:35.759373 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.759356 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"
Apr 22 17:55:35.841308 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841229 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841308 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841269 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcrpj\" (UniqueName: \"kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841308 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841294 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841415 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm\") pod \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\" (UID: \"234f3d9e-ee85-48fb-8c8b-4296725b4b43\") "
Apr 22 17:55:35.841607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache" (OuterVolumeSpecName: "model-cache") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:35.841866 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841816 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:35.841923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.841893 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home" (OuterVolumeSpecName: "home") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:35.843578 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.843548 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj" (OuterVolumeSpecName: "kube-api-access-kcrpj") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "kube-api-access-kcrpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:55:35.843683 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.843597 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:55:35.843683 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.843603 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm" (OuterVolumeSpecName: "dshm") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:35.902369 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.902312 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "234f3d9e-ee85-48fb-8c8b-4296725b4b43" (UID: "234f3d9e-ee85-48fb-8c8b-4296725b4b43"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:35.943045 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.943004 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcrpj\" (UniqueName: \"kubernetes.io/projected/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kube-api-access-kcrpj\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:35.943045 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.943039 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/234f3d9e-ee85-48fb-8c8b-4296725b4b43-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:35.943045 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.943055 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:35.943288 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.943067 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:35.943288 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:35.943081 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/234f3d9e-ee85-48fb-8c8b-4296725b4b43-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 17:55:36.051472 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051442 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq_234f3d9e-ee85-48fb-8c8b-4296725b4b43/main/0.log"
Apr 22 17:55:36.051806 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051782 2572 generic.go:358] "Generic (PLEG): container finished" podID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerID="1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b" exitCode=137
Apr 22 17:55:36.051922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051857 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"
Apr 22 17:55:36.051922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerDied","Data":"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"}
Apr 22 17:55:36.051922 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq" event={"ID":"234f3d9e-ee85-48fb-8c8b-4296725b4b43","Type":"ContainerDied","Data":"06ed5d099adf0db000f6203e43cc175f8b8f274fc47d26e1ca6d1ceb8b451e0a"}
Apr 22 17:55:36.052052 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.051927 2572 scope.go:117] "RemoveContainer" containerID="1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"
Apr 22 17:55:36.072955 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.072930 2572 scope.go:117] "RemoveContainer" containerID="b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"
Apr 22 17:55:36.074545 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.074523 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"]
Apr 22 17:55:36.081980 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.081954 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c69d77f97mhfgq"]
Apr 22 17:55:36.131685 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.131662 2572 scope.go:117] "RemoveContainer" containerID="1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"
Apr 22 17:55:36.132071 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:55:36.132045 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b\": container with ID starting with 1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b not found: ID does not exist" containerID="1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"
Apr 22 17:55:36.132127 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.132081 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b"} err="failed to get container status \"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b\": rpc error: code = NotFound desc = could not find container \"1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b\": container with ID starting with 1d90367859e399651dca7982f100fd007b4dd319aa5d64fdd53e571c44ff058b not found: ID does not exist"
Apr 22 17:55:36.132127 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.132101 2572 scope.go:117] "RemoveContainer" containerID="b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"
Apr 22 17:55:36.132376 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:55:36.132359 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb\": container with ID starting with b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb not found: ID does not exist" containerID="b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"
Apr 22 17:55:36.132430 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:36.132380 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb"} err="failed to get container status \"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb\": rpc error: code = NotFound desc = could not find container \"b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb\": container with ID starting with b83ffe7ae927d22bcbb7b8deb14f885d6c7ebc72d5aff5fe56aa390cdcfbb3cb not found: ID does not exist"
Apr 22 17:55:37.100793 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:37.100754 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" path="/var/lib/kubelet/pods/234f3d9e-ee85-48fb-8c8b-4296725b4b43/volumes"
Apr 22 17:55:38.810815 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:38.810763 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 22 17:55:40.893677 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:40.893641 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"]
Apr 22 17:55:40.894171 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:40.893950 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" containerID="cri-o://fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e" gracePeriod=30
Apr 22 17:55:48.810671 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:48.810620 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 22 17:55:58.810527 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:55:58.810478 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 22 17:56:08.809900 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:08.809843 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 22 17:56:11.169188 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.169164 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-869cf6fb9f-p6pjs_29e1b1ee-f679-47a5-b568-dea36a0e8530/main/0.log"
Apr 22 17:56:11.169578 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.169559 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"
Apr 22 17:56:11.198140 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198113 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-869cf6fb9f-p6pjs_29e1b1ee-f679-47a5-b568-dea36a0e8530/main/0.log"
Apr 22 17:56:11.198509 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198486 2572 generic.go:358] "Generic (PLEG): container finished" podID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerID="fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e" exitCode=137
Apr 22 17:56:11.198607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198563 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"
Apr 22 17:56:11.198607 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerDied","Data":"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e"}
Apr 22 17:56:11.198686 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs" event={"ID":"29e1b1ee-f679-47a5-b568-dea36a0e8530","Type":"ContainerDied","Data":"a5b0a0f8e38b8751d2d8e8470b96cff42115ea2deaf46c4a25b8ce7233942ea0"}
Apr 22 17:56:11.198686 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.198641 2572 scope.go:117] "RemoveContainer" containerID="fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e"
Apr 22 17:56:11.218841 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.218783 2572 scope.go:117] "RemoveContainer" containerID="92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d"
Apr 22 17:56:11.265074 ip-10-0-135-36
kubenswrapper[2572]: I0422 17:56:11.265037 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265090 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265185 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265212 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265271 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265244 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9sgz\" (UniqueName: \"kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265496 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265290 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location\") pod \"29e1b1ee-f679-47a5-b568-dea36a0e8530\" (UID: \"29e1b1ee-f679-47a5-b568-dea36a0e8530\") " Apr 22 17:56:11.265496 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265443 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home" (OuterVolumeSpecName: "home") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:56:11.265609 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.265546 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache" (OuterVolumeSpecName: "model-cache") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:56:11.267539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.267505 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:56:11.267539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.267523 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm" (OuterVolumeSpecName: "dshm") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:56:11.267817 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.267796 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz" (OuterVolumeSpecName: "kube-api-access-g9sgz") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "kube-api-access-g9sgz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:56:11.280488 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.280465 2572 scope.go:117] "RemoveContainer" containerID="fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e" Apr 22 17:56:11.281023 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:56:11.280999 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e\": container with ID starting with fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e not found: ID does not exist" containerID="fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e" Apr 22 17:56:11.281116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.281032 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e"} err="failed to get container status \"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e\": rpc error: code = NotFound desc = could not find container \"fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e\": container with ID starting with fbee7f121117f0344ae67b74c15b8d018c9a96d36b3a254c6f9c48e65d62739e not found: ID does not exist" Apr 22 17:56:11.281116 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.281052 2572 scope.go:117] "RemoveContainer" 
containerID="92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d" Apr 22 17:56:11.281377 ip-10-0-135-36 kubenswrapper[2572]: E0422 17:56:11.281358 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d\": container with ID starting with 92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d not found: ID does not exist" containerID="92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d" Apr 22 17:56:11.281445 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.281384 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d"} err="failed to get container status \"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d\": rpc error: code = NotFound desc = could not find container \"92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d\": container with ID starting with 92e254a98a33b15daef9cc7289d5c30bf383da830d55457bec48d32b858bb16d not found: ID does not exist" Apr 22 17:56:11.328134 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.328091 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29e1b1ee-f679-47a5-b568-dea36a0e8530" (UID: "29e1b1ee-f679-47a5-b568-dea36a0e8530"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:56:11.366345 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366313 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.366345 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366343 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.366559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366355 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9sgz\" (UniqueName: \"kubernetes.io/projected/29e1b1ee-f679-47a5-b568-dea36a0e8530-kube-api-access-g9sgz\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.366559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366365 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.366559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366374 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/29e1b1ee-f679-47a5-b568-dea36a0e8530-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.366559 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.366383 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/29e1b1ee-f679-47a5-b568-dea36a0e8530-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:56:11.523172 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.523135 2572 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"] Apr 22 17:56:11.527442 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:11.527416 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-869cf6fb9f-p6pjs"] Apr 22 17:56:13.098840 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:13.098804 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" path="/var/lib/kubelet/pods/29e1b1ee-f679-47a5-b568-dea36a0e8530/volumes" Apr 22 17:56:18.810284 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:18.810233 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 22 17:56:28.810595 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:28.810549 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 22 17:56:38.810744 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:38.810685 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 22 17:56:48.810077 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:48.809983 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" 
podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 22 17:56:58.820325 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:58.820292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" Apr 22 17:56:58.828546 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:56:58.828519 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" Apr 22 17:57:07.707517 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.707481 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"] Apr 22 17:57:07.708066 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708047 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708069 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708083 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708091 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708101 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="storage-initializer" Apr 22 17:57:07.708155 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:57:07.708110 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="storage-initializer" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708135 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="storage-initializer" Apr 22 17:57:07.708155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708143 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="storage-initializer" Apr 22 17:57:07.708513 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708287 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="234f3d9e-ee85-48fb-8c8b-4296725b4b43" containerName="main" Apr 22 17:57:07.708513 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.708301 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="29e1b1ee-f679-47a5-b568-dea36a0e8530" containerName="main" Apr 22 17:57:07.711888 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.711867 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.714121 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.714095 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 17:57:07.714233 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.714153 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-qndhq\"" Apr 22 17:57:07.724269 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.724242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"] Apr 22 17:57:07.735073 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.735031 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"] Apr 22 17:57:07.739356 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.739335 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.749985 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.749957 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"] Apr 22 17:57:07.767667 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.767837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.767837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.767837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrxx\" (UniqueName: 
\"kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.767837 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767816 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhh4h\" (UniqueName: \"kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.767993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.768016 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.768009 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.768272 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.768101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869314 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869314 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869589 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869589 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: 
\"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869589 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869589 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869589 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrxx\" (UniqueName: \"kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 
22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869776 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhh4h\" (UniqueName: 
\"kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.869897 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.869827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.870252 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.870012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.870252 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.870140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.870252 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.870158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: 
\"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.870252 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.870212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.872000 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.871979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.872278 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.872260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.872415 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.872394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.872455 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.872436 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:07.879523 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.879494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhh4h\" (UniqueName: \"kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h\") pod \"router-with-refs-pd-test-kserve-55688d4795-7rb78\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:07.879725 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:07.879683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrxx\" (UniqueName: \"kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx\") pod \"router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:08.021195 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.021105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:08.052992 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.052952 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:08.182014 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.181985 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"] Apr 22 17:57:08.182790 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:57:08.182758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd998102_5c6d_4cad_9e90_90fe37d10a40.slice/crio-923248b48a159bffbc8d7e1c696b4a96cad63b57ec1b4de33692f8da77069307 WatchSource:0}: Error finding container 923248b48a159bffbc8d7e1c696b4a96cad63b57ec1b4de33692f8da77069307: Status 404 returned error can't find the container with id 923248b48a159bffbc8d7e1c696b4a96cad63b57ec1b4de33692f8da77069307 Apr 22 17:57:08.201200 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.201166 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"] Apr 22 17:57:08.203567 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:57:08.203541 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc546ea_5b75_484f_9eb1_8fa7ede15350.slice/crio-9f5b8da003650feac84b4c014c4f64319e42308a44c3165850425a4c0e323c97 WatchSource:0}: Error finding container 9f5b8da003650feac84b4c014c4f64319e42308a44c3165850425a4c0e323c97: Status 404 returned error can't find the container with id 9f5b8da003650feac84b4c014c4f64319e42308a44c3165850425a4c0e323c97 Apr 22 17:57:08.432264 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.432228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" 
event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerStarted","Data":"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484"} Apr 22 17:57:08.432476 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.432271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerStarted","Data":"9f5b8da003650feac84b4c014c4f64319e42308a44c3165850425a4c0e323c97"} Apr 22 17:57:08.433431 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:08.433397 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerStarted","Data":"923248b48a159bffbc8d7e1c696b4a96cad63b57ec1b4de33692f8da77069307"} Apr 22 17:57:09.440266 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:09.440217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerStarted","Data":"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52"} Apr 22 17:57:09.440762 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:09.440595 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:10.445508 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:10.445468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerStarted","Data":"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa"} Apr 22 17:57:12.457216 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:12.457180 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerID="8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484" exitCode=0 Apr 22 17:57:12.457711 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:12.457256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerDied","Data":"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484"} Apr 22 17:57:13.464465 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:13.464424 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerStarted","Data":"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3"} Apr 22 17:57:13.486437 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:13.486369 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podStartSLOduration=6.486347 podStartE2EDuration="6.486347s" podCreationTimestamp="2026-04-22 17:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:13.484712142 +0000 UTC m=+1366.963968820" watchObservedRunningTime="2026-04-22 17:57:13.486347 +0000 UTC m=+1366.965603683" Apr 22 17:57:14.470555 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:14.470524 2572 generic.go:358] "Generic (PLEG): container finished" podID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerID="52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa" exitCode=0 Apr 22 17:57:14.470965 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:14.470593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" 
event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerDied","Data":"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa"} Apr 22 17:57:15.477024 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:15.476983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerStarted","Data":"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86"} Apr 22 17:57:15.502884 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:15.502818 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podStartSLOduration=7.658108456 podStartE2EDuration="8.502798249s" podCreationTimestamp="2026-04-22 17:57:07 +0000 UTC" firstStartedPulling="2026-04-22 17:57:08.184947128 +0000 UTC m=+1361.664203787" lastFinishedPulling="2026-04-22 17:57:09.029636917 +0000 UTC m=+1362.508893580" observedRunningTime="2026-04-22 17:57:15.49789579 +0000 UTC m=+1368.977152473" watchObservedRunningTime="2026-04-22 17:57:15.502798249 +0000 UTC m=+1368.982054931" Apr 22 17:57:18.021979 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.021935 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:18.021979 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.021983 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:18.023432 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.023403 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial 
tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:57:18.053347 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.053307 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:18.053544 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.053487 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 17:57:18.055035 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:18.055000 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:57:19.391213 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:19.391173 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"] Apr 22 17:57:19.391622 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:19.391564 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" containerID="cri-o://4add97eb8dcf5aa4ba6a025482cb7c336f41715620576256296b4c5b47785627" gracePeriod=30 Apr 22 17:57:28.022525 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:28.022475 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:57:28.039712 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:57:28.039670 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:57:28.054124 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:28.054085 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:57:30.535451 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.535399 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 17:57:30.570067 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.570039 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 17:57:30.570235 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.570159 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.572811 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.572786 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 17:57:30.705135 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgtc\" (UniqueName: \"kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.705336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.705336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.705336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm\") pod 
\"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.705336 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705268 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.705590 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.705348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.805919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgtc\" (UniqueName: \"kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806013 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.805986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806320 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806566 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806629 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.806759 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.806689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.808526 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.808505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.809053 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.809027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.819539 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.819507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgtc\" (UniqueName: \"kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc\") pod \"router-with-refs-test-kserve-7845bc4d6c-4zbgc\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:30.883717 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:30.883656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:31.080933 ip-10-0-135-36 kubenswrapper[2572]: W0422 17:57:31.080890 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca9f4883_47e8_4b60_8cb9_cfa478a35411.slice/crio-02e0250265972e33de72c7b759f3a1e6f91f9b91ae2bcebf69683dc07faa05d2 WatchSource:0}: Error finding container 02e0250265972e33de72c7b759f3a1e6f91f9b91ae2bcebf69683dc07faa05d2: Status 404 returned error can't find the container with id 02e0250265972e33de72c7b759f3a1e6f91f9b91ae2bcebf69683dc07faa05d2 Apr 22 17:57:31.088084 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:31.088057 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 17:57:31.546280 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:31.546235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerStarted","Data":"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af"} Apr 22 17:57:31.546280 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:31.546284 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerStarted","Data":"02e0250265972e33de72c7b759f3a1e6f91f9b91ae2bcebf69683dc07faa05d2"} Apr 22 17:57:36.571380 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:36.571340 2572 generic.go:358] "Generic (PLEG): container finished" podID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerID="ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af" exitCode=0 Apr 22 17:57:36.572010 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:36.571408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerDied","Data":"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af"} Apr 22 17:57:37.579535 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:37.579494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerStarted","Data":"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8"} Apr 22 17:57:37.603093 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:37.603028 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podStartSLOduration=7.603007335 podStartE2EDuration="7.603007335s" podCreationTimestamp="2026-04-22 17:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:37.601475629 +0000 UTC m=+1391.080732433" watchObservedRunningTime="2026-04-22 17:57:37.603007335 +0000 UTC m=+1391.082264020" Apr 22 17:57:38.022105 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:38.022057 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:57:38.053787 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:38.053743 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:57:40.884363 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:40.884319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:40.884841 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:40.884485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 17:57:40.886225 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:40.886185 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:57:48.021960 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:48.021903 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:57:48.053989 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:57:48.053937 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:57:49.645286 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.645199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-66d46d49f5-gcwg4_ca75965a-862d-492d-b120-f7f935c93947/main/0.log" Apr 22 17:57:49.645767 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.645628 2572 generic.go:358] "Generic (PLEG): container finished" podID="ca75965a-862d-492d-b120-f7f935c93947" containerID="4add97eb8dcf5aa4ba6a025482cb7c336f41715620576256296b4c5b47785627" exitCode=137 Apr 22 17:57:49.645767 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.645753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerDied","Data":"4add97eb8dcf5aa4ba6a025482cb7c336f41715620576256296b4c5b47785627"} Apr 22 17:57:49.714011 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.713979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-66d46d49f5-gcwg4_ca75965a-862d-492d-b120-f7f935c93947/main/0.log" Apr 22 17:57:49.714422 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.714400 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" Apr 22 17:57:49.804641 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804543 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.804641 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.804919 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804670 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.804919 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804770 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69nn\" (UniqueName: \"kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.804919 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804818 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.804919 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:57:49.804851 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home\") pod \"ca75965a-862d-492d-b120-f7f935c93947\" (UID: \"ca75965a-862d-492d-b120-f7f935c93947\") " Apr 22 17:57:49.805138 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.804986 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache" (OuterVolumeSpecName: "model-cache") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:49.805206 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.805181 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:49.805369 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.805297 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home" (OuterVolumeSpecName: "home") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:49.816652 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.816613 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm" (OuterVolumeSpecName: "dshm") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:49.816840 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.816782 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:49.816896 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.816838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn" (OuterVolumeSpecName: "kube-api-access-q69nn") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "kube-api-access-q69nn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:49.890800 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.890730 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca75965a-862d-492d-b120-f7f935c93947" (UID: "ca75965a-862d-492d-b120-f7f935c93947"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:49.906407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.906317 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:49.906407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.906355 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:49.906407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.906370 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca75965a-862d-492d-b120-f7f935c93947-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:49.906407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.906387 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q69nn\" (UniqueName: \"kubernetes.io/projected/ca75965a-862d-492d-b120-f7f935c93947-kube-api-access-q69nn\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:49.906407 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:49.906401 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca75965a-862d-492d-b120-f7f935c93947-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 17:57:50.652625 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.652579 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-66d46d49f5-gcwg4_ca75965a-862d-492d-b120-f7f935c93947/main/0.log" Apr 22 17:57:50.653152 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.653123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" event={"ID":"ca75965a-862d-492d-b120-f7f935c93947","Type":"ContainerDied","Data":"a406543275b1dd722ac9088b27cc3eda6dde84b82cdc792e47c79fa91c00a656"} Apr 22 17:57:50.653224 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.653180 2572 scope.go:117] "RemoveContainer" containerID="4add97eb8dcf5aa4ba6a025482cb7c336f41715620576256296b4c5b47785627" Apr 22 17:57:50.653224 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.653133 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4" Apr 22 17:57:50.682923 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.682880 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"] Apr 22 17:57:50.685871 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.685842 2572 scope.go:117] "RemoveContainer" containerID="ac1b89a508f30711364e5a9d4b748525fcd54a837bc3455349e41502f6e03cbb" Apr 22 17:57:50.688021 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.687999 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-66d46d49f5-gcwg4"] Apr 22 17:57:50.884360 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:50.884317 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:57:51.111176 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:51.111090 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca75965a-862d-492d-b120-f7f935c93947" path="/var/lib/kubelet/pods/ca75965a-862d-492d-b120-f7f935c93947/volumes" Apr 22 17:57:58.021988 ip-10-0-135-36 
kubenswrapper[2572]: I0422 17:57:58.021937 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:57:58.054000 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:57:58.053952 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:58:00.884765 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:00.884718 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:08.021628 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:08.021566 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:58:08.054068 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:08.054025 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 
22 17:58:10.884246 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:10.884180 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:18.021750 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:18.021633 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:58:18.053387 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:18.053336 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:58:20.884830 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:20.884777 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:28.022174 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:28.022118 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: 
connection refused" Apr 22 17:58:28.053502 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:28.053447 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:58:30.884218 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:30.884170 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:38.022549 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:38.022493 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:58:38.053693 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:38.053656 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:58:40.884655 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:40.884610 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial 
tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:48.022190 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:48.022140 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:58:48.054147 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:48.054101 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:58:50.885112 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:50.885062 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:58:58.022307 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:58.022256 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:58:58.054392 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:58:58.054347 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:00.884905 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:00.884865 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:08.022473 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:08.022430 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:59:08.053825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:08.053773 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:10.884936 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:10.884881 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:18.021741 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:18.021682 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:59:18.054155 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:18.054115 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:20.884472 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:20.884423 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:27.143077 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:27.143047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 17:59:27.149551 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:27.149518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 17:59:28.021876 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:28.021830 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:59:28.054263 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:28.054223 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:30.884825 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:30.884777 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:38.022457 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:38.022393 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:59:38.053835 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:38.053798 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:40.884816 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:40.884763 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:48.022206 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:48.022107 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 22 17:59:48.053435 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:48.053388 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 17:59:50.884948 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:50.884899 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 17:59:58.031880 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:58.031837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:59:58.044792 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:58.044763 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 17:59:58.054069 ip-10-0-135-36 kubenswrapper[2572]: I0422 17:59:58.054037 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection 
refused" Apr 22 18:00:00.884521 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:00.884481 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 18:00:08.054247 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:08.054197 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 22 18:00:10.884382 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:10.884326 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 22 18:00:18.063433 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:18.063398 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 18:00:18.071135 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:18.071108 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 18:00:20.894818 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:20.894785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 18:00:20.903743 ip-10-0-135-36 kubenswrapper[2572]: I0422 
18:00:20.903710 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 18:00:30.393968 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:30.393925 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 18:00:30.397467 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:30.395381 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" containerID="cri-o://834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8" gracePeriod=30 Apr 22 18:00:45.814732 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.814681 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"] Apr 22 18:00:45.815334 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.815300 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="storage-initializer" Apr 22 18:00:45.815334 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.815315 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="storage-initializer" Apr 22 18:00:45.815334 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.815333 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" Apr 22 18:00:45.815519 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.815341 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" Apr 22 18:00:45.815519 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.815458 2572 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="ca75965a-862d-492d-b120-f7f935c93947" containerName="main" Apr 22 18:00:45.820863 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.820837 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" Apr 22 18:00:45.821916 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.821895 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"] Apr 22 18:00:45.824586 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.824564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 18:00:45.824586 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.824578 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-xqjgh\"" Apr 22 18:00:45.825381 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.825364 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.836560 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.836532 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"]
Apr 22 18:00:45.849839 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.849810 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"]
Apr 22 18:00:45.878902 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.878871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.878902 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.878904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.879098 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.878932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.879098 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.878986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.879098 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.879098 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.879250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.879250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tl2\" (UniqueName: \"kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.879250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.879250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.879250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkdz\" (UniqueName: \"kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.879471 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.879260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.980669 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49tl2\" (UniqueName: \"kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.980876 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.980876 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.980876 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkdz\" (UniqueName: \"kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.980876 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.980987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981113 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981478 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981478 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981478 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.981648 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981803 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.981982 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.981960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.983253 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.983233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.983421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.983403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.983812 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.983793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:45.983943 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.983923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.989046 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.989022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tl2\" (UniqueName: \"kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:45.989437 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:45.989422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkdz\" (UniqueName: \"kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:46.141514 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.141481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:46.153394 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.153360 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:46.296405 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:00:46.296372 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1103e4e1_442f_45fe_beb9_0727f4365396.slice/crio-b14b7c0ed64ef30e0f9a93f9d24df469beb11933cd970a17e6f5cbfe2dbc5ff8 WatchSource:0}: Error finding container b14b7c0ed64ef30e0f9a93f9d24df469beb11933cd970a17e6f5cbfe2dbc5ff8: Status 404 returned error can't find the container with id b14b7c0ed64ef30e0f9a93f9d24df469beb11933cd970a17e6f5cbfe2dbc5ff8
Apr 22 18:00:46.296557 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.296526 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"]
Apr 22 18:00:46.298396 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.298378 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:00:46.320240 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.320216 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"]
Apr 22 18:00:46.322052 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:00:46.322027 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a3c174_2c25_4a96_b6b3_d8060413a9f3.slice/crio-865bcfd4aeab5df9e954099232319db01a39a061f2ae00179234927fa473e72c WatchSource:0}: Error finding container 865bcfd4aeab5df9e954099232319db01a39a061f2ae00179234927fa473e72c: Status 404 returned error can't find the container with id 865bcfd4aeab5df9e954099232319db01a39a061f2ae00179234927fa473e72c
Apr 22 18:00:46.401312 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.401215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerStarted","Data":"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b"}
Apr 22 18:00:46.401312 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.401263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerStarted","Data":"865bcfd4aeab5df9e954099232319db01a39a061f2ae00179234927fa473e72c"}
Apr 22 18:00:46.402922 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.402893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerStarted","Data":"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946"}
Apr 22 18:00:46.403042 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.402927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerStarted","Data":"b14b7c0ed64ef30e0f9a93f9d24df469beb11933cd970a17e6f5cbfe2dbc5ff8"}
Apr 22 18:00:46.403042 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:46.402991 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:47.411672 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:47.411630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerStarted","Data":"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357"}
Apr 22 18:00:47.996878 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:47.995499 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"]
Apr 22 18:00:47.996878 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:47.996853 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" containerID="cri-o://50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" gracePeriod=30
Apr 22 18:00:48.000901 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:48.000853 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"]
Apr 22 18:00:48.001366 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:48.001329 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" containerID="cri-o://b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3" gracePeriod=30
Apr 22 18:00:51.432610 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:51.432570 2572 generic.go:358] "Generic (PLEG): container finished" podID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerID="8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b" exitCode=0
Apr 22 18:00:51.433118 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:51.432644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerDied","Data":"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b"}
Apr 22 18:00:51.434349 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:51.434328 2572 generic.go:358] "Generic (PLEG): container finished" podID="1103e4e1-442f-45fe-beb9-0727f4365396" containerID="2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357" exitCode=0
Apr 22 18:00:51.434429 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:51.434393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerDied","Data":"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357"}
Apr 22 18:00:52.441454 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:52.441414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerStarted","Data":"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda"}
Apr 22 18:00:52.443668 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:52.443633 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerStarted","Data":"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832"}
Apr 22 18:00:52.481026 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:52.480969 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podStartSLOduration=7.480952398 podStartE2EDuration="7.480952398s" podCreationTimestamp="2026-04-22 18:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:00:52.478874937 +0000 UTC m=+1585.958131643" watchObservedRunningTime="2026-04-22 18:00:52.480952398 +0000 UTC m=+1585.960209079"
Apr 22 18:00:52.510872 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:52.510821 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podStartSLOduration=7.51080331 podStartE2EDuration="7.51080331s" podCreationTimestamp="2026-04-22 18:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:00:52.508290255 +0000 UTC m=+1585.987546938" watchObservedRunningTime="2026-04-22 18:00:52.51080331 +0000 UTC m=+1585.990059989"
Apr 22 18:00:56.141609 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.141554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:56.141609 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.141618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:56.143009 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.142973 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused"
Apr 22 18:00:56.154406 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.154373 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:56.154592 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.154423 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"
Apr 22 18:00:56.155659 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.155617 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused"
Apr 22 18:00:56.167868 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:56.167840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"
Apr 22 18:00:58.165823 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.165785 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"]
Apr 22 18:00:58.201482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.201450 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"]
Apr 22 18:00:58.201759 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.201739 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.204236 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.204208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 22 18:00:58.302101 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7b47\" (UniqueName: \"kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.302319 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.302319 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.302319 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.302462 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.302498 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.302471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403209 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7b47\" (UniqueName: \"kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.403421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.403420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.404027 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.404002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.404207 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.404033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.404207 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.404118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.406010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.405988 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.406222 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.406192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.413396 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.413366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7b47\" (UniqueName: \"kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.514092 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.513987 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"
Apr 22 18:00:58.668924 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:58.668894 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"]
Apr 22 18:00:58.670404 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:00:58.670375 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7383fe24_e41e_4638_9411_2d715139b18b.slice/crio-40bbcc9a2cdde3a7361f6be9f2eecba7a301339ac49973101dd4f912c1811f3d WatchSource:0}: Error finding container 40bbcc9a2cdde3a7361f6be9f2eecba7a301339ac49973101dd4f912c1811f3d: Status 404 returned error can't find the container with id 40bbcc9a2cdde3a7361f6be9f2eecba7a301339ac49973101dd4f912c1811f3d
Apr 22 18:00:59.475660 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:59.475620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerStarted","Data":"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992"}
Apr 22 18:00:59.476076 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:00:59.475674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerStarted","Data":"40bbcc9a2cdde3a7361f6be9f2eecba7a301339ac49973101dd4f912c1811f3d"}
Apr 22 18:01:00.771458 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.771424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-7845bc4d6c-4zbgc_ca9f4883-47e8-4b60-8cb9-cfa478a35411/main/0.log"
Apr 22 18:01:00.771888 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.771821 2572
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 18:01:00.943664 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943555 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bgtc\" (UniqueName: \"kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.943870 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943673 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.943870 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943734 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.943870 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943770 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.943870 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943868 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: 
\"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.944095 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.943905 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm\") pod \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\" (UID: \"ca9f4883-47e8-4b60-8cb9-cfa478a35411\") " Apr 22 18:01:00.944095 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.944077 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache" (OuterVolumeSpecName: "model-cache") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:00.944274 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.944245 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home" (OuterVolumeSpecName: "home") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:00.944351 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.944305 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:00.947018 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.946979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc" (OuterVolumeSpecName: "kube-api-access-2bgtc") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). 
InnerVolumeSpecName "kube-api-access-2bgtc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:01:00.947018 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.946987 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:01:00.947371 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.947345 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm" (OuterVolumeSpecName: "dshm") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:00.964531 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:00.964487 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca9f4883-47e8-4b60-8cb9-cfa478a35411" (UID: "ca9f4883-47e8-4b60-8cb9-cfa478a35411"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:01.045430 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.045385 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:01.045430 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.045417 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:01.045430 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.045428 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f4883-47e8-4b60-8cb9-cfa478a35411-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:01.045430 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.045438 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ca9f4883-47e8-4b60-8cb9-cfa478a35411-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:01.045430 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.045447 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2bgtc\" (UniqueName: \"kubernetes.io/projected/ca9f4883-47e8-4b60-8cb9-cfa478a35411-kube-api-access-2bgtc\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:01.487854 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.487825 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-7845bc4d6c-4zbgc_ca9f4883-47e8-4b60-8cb9-cfa478a35411/main/0.log" Apr 22 18:01:01.488290 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.488258 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerID="834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8" exitCode=137 Apr 22 18:01:01.488407 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.488312 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerDied","Data":"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8"} Apr 22 18:01:01.488407 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.488361 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" Apr 22 18:01:01.488407 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.488381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc" event={"ID":"ca9f4883-47e8-4b60-8cb9-cfa478a35411","Type":"ContainerDied","Data":"02e0250265972e33de72c7b759f3a1e6f91f9b91ae2bcebf69683dc07faa05d2"} Apr 22 18:01:01.488407 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.488407 2572 scope.go:117] "RemoveContainer" containerID="834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8" Apr 22 18:01:01.515541 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.515514 2572 scope.go:117] "RemoveContainer" containerID="ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af" Apr 22 18:01:01.515720 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.515676 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 18:01:01.517008 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.516979 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-7845bc4d6c-4zbgc"] Apr 22 18:01:01.546993 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.546943 2572 scope.go:117] 
"RemoveContainer" containerID="834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8" Apr 22 18:01:01.547398 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:01.547367 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8\": container with ID starting with 834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8 not found: ID does not exist" containerID="834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8" Apr 22 18:01:01.547498 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.547424 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8"} err="failed to get container status \"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8\": rpc error: code = NotFound desc = could not find container \"834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8\": container with ID starting with 834e6d0e0cdd65f12a9db3fc634c2ada89cedc813e0f0a92e5ef5a512f4e47a8 not found: ID does not exist" Apr 22 18:01:01.547498 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.547453 2572 scope.go:117] "RemoveContainer" containerID="ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af" Apr 22 18:01:01.547853 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:01.547832 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af\": container with ID starting with ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af not found: ID does not exist" containerID="ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af" Apr 22 18:01:01.547929 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:01.547865 2572 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af"} err="failed to get container status \"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af\": rpc error: code = NotFound desc = could not find container \"ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af\": container with ID starting with ac44dffab08392eb9bc97a6c805b1388beb8ec068cb5219fa72ef983d3eb88af not found: ID does not exist" Apr 22 18:01:03.100796 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:03.100753 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" path="/var/lib/kubelet/pods/ca9f4883-47e8-4b60-8cb9-cfa478a35411/volumes" Apr 22 18:01:03.501915 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:03.501875 2572 generic.go:358] "Generic (PLEG): container finished" podID="7383fe24-e41e-4638-9411-2d715139b18b" containerID="53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992" exitCode=0 Apr 22 18:01:03.502204 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:03.501947 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerDied","Data":"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992"} Apr 22 18:01:04.508933 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:04.508897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerStarted","Data":"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e"} Apr 22 18:01:04.531762 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:04.531687 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podStartSLOduration=6.5316668589999995 podStartE2EDuration="6.531666859s" podCreationTimestamp="2026-04-22 18:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:01:04.52877562 +0000 UTC m=+1598.008032302" watchObservedRunningTime="2026-04-22 18:01:04.531666859 +0000 UTC m=+1598.010923541" Apr 22 18:01:06.142973 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:06.142898 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:06.154182 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:06.154140 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:08.515244 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:08.515197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:01:08.515656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:08.515262 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:01:08.517205 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:08.517166 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:01:16.142649 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:16.142535 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:16.154384 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:16.154337 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:17.997342 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:17.997281 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" containerID="cri-o://dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" gracePeriod=2 Apr 22 18:01:18.022408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.022365 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 22 18:01:18.022635 ip-10-0-135-36 
kubenswrapper[2572]: I0422 18:01:18.022605 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 22 18:01:18.033254 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.033208 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 22 18:01:18.066965 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.066926 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": read tcp 10.134.0.2:38084->10.134.0.55:8000: read: connection reset by peer" Apr 22 18:01:18.424077 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.424044 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 18:01:18.513116 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.513077 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-55688d4795-7rb78_bd998102-5c6d-4cad-9e90-90fe37d10a40/main/0.log" Apr 22 18:01:18.513997 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.513969 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 18:01:18.514515 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.514485 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:01:18.525298 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 18:01:18.525482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrxx\" (UniqueName: \"kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 18:01:18.525482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525349 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 18:01:18.525482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525395 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 
18:01:18.525482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525415 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 18:01:18.525482 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525466 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home\") pod \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\" (UID: \"cbc546ea-5b75-484f-9eb1-8fa7ede15350\") " Apr 22 18:01:18.525785 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525639 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache" (OuterVolumeSpecName: "model-cache") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.525882 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525795 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.526014 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.525990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home" (OuterVolumeSpecName: "home") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.528252 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.528214 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm" (OuterVolumeSpecName: "dshm") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.528427 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.528307 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx" (OuterVolumeSpecName: "kube-api-access-ldrxx") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "kube-api-access-ldrxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:01:18.528504 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.528477 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:01:18.544809 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.544755 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cbc546ea-5b75-484f-9eb1-8fa7ede15350" (UID: "cbc546ea-5b75-484f-9eb1-8fa7ede15350"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.584724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.584417 2572 generic.go:358] "Generic (PLEG): container finished" podID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerID="b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3" exitCode=137 Apr 22 18:01:18.584724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.584503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerDied","Data":"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3"} Apr 22 18:01:18.584724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.584524 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" Apr 22 18:01:18.584724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.584535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7" event={"ID":"cbc546ea-5b75-484f-9eb1-8fa7ede15350","Type":"ContainerDied","Data":"9f5b8da003650feac84b4c014c4f64319e42308a44c3165850425a4c0e323c97"} Apr 22 18:01:18.584724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.584555 2572 scope.go:117] "RemoveContainer" containerID="b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3" Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.586418 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-55688d4795-7rb78_bd998102-5c6d-4cad-9e90-90fe37d10a40/main/0.log" Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587138 2572 generic.go:358] "Generic (PLEG): container finished" podID="bd998102-5c6d-4cad-9e90-90fe37d10a40" 
containerID="50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" exitCode=137 Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587160 2572 generic.go:358] "Generic (PLEG): container finished" podID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerID="dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" exitCode=0 Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerDied","Data":"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86"} Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587287 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerDied","Data":"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52"} Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" event={"ID":"bd998102-5c6d-4cad-9e90-90fe37d10a40","Type":"ContainerDied","Data":"923248b48a159bffbc8d7e1c696b4a96cad63b57ec1b4de33692f8da77069307"} Apr 22 18:01:18.589752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.587408 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78" Apr 22 18:01:18.615809 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.615774 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"] Apr 22 18:01:18.617883 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.617836 2572 scope.go:117] "RemoveContainer" containerID="8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484" Apr 22 18:01:18.618338 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.618314 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6468dbcdc6-97tt7"] Apr 22 18:01:18.626298 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626418 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626380 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626478 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626414 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhh4h\" (UniqueName: \"kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626478 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626451 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626592 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626485 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626592 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626565 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location\") pod \"bd998102-5c6d-4cad-9e90-90fe37d10a40\" (UID: \"bd998102-5c6d-4cad-9e90-90fe37d10a40\") " Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626854 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626878 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc546ea-5b75-484f-9eb1-8fa7ede15350-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626895 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626908 2572 reconciler_common.go:299] "Volume 
detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc546ea-5b75-484f-9eb1-8fa7ede15350-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626923 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldrxx\" (UniqueName: \"kubernetes.io/projected/cbc546ea-5b75-484f-9eb1-8fa7ede15350-kube-api-access-ldrxx\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.626987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.626947 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache" (OuterVolumeSpecName: "model-cache") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.628685 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.628662 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm" (OuterVolumeSpecName: "dshm") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.629001 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.628982 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:01:18.639811 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.639785 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h" (OuterVolumeSpecName: "kube-api-access-nhh4h") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "kube-api-access-nhh4h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:01:18.641019 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.640996 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home" (OuterVolumeSpecName: "home") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.657114 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.657086 2572 scope.go:117] "RemoveContainer" containerID="b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3" Apr 22 18:01:18.657451 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:18.657429 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3\": container with ID starting with b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3 not found: ID does not exist" containerID="b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3" Apr 22 18:01:18.657550 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.657466 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3"} err="failed to get container status \"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3\": 
rpc error: code = NotFound desc = could not find container \"b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3\": container with ID starting with b824f5f8ba2f2aee9d1a63098e02f8de4bcd030e7181d00895347e6c310462b3 not found: ID does not exist" Apr 22 18:01:18.657550 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.657483 2572 scope.go:117] "RemoveContainer" containerID="8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484" Apr 22 18:01:18.657829 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:18.657809 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484\": container with ID starting with 8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484 not found: ID does not exist" containerID="8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484" Apr 22 18:01:18.657913 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.657834 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484"} err="failed to get container status \"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484\": rpc error: code = NotFound desc = could not find container \"8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484\": container with ID starting with 8a7c16cf1b01d8b41ebfd0e5117952632870f3bdd6b022e40091689a621ae484 not found: ID does not exist" Apr 22 18:01:18.657913 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.657853 2572 scope.go:117] "RemoveContainer" containerID="50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" Apr 22 18:01:18.660038 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.660006 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location" 
(OuterVolumeSpecName: "kserve-provision-location") pod "bd998102-5c6d-4cad-9e90-90fe37d10a40" (UID: "bd998102-5c6d-4cad-9e90-90fe37d10a40"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:01:18.681966 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.681647 2572 scope.go:117] "RemoveContainer" containerID="52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa" Apr 22 18:01:18.711740 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.711714 2572 scope.go:117] "RemoveContainer" containerID="dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" Apr 22 18:01:18.722092 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.722064 2572 scope.go:117] "RemoveContainer" containerID="50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" Apr 22 18:01:18.722431 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:18.722401 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86\": container with ID starting with 50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86 not found: ID does not exist" containerID="50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" Apr 22 18:01:18.722527 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.722441 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86"} err="failed to get container status \"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86\": rpc error: code = NotFound desc = could not find container \"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86\": container with ID starting with 50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86 not found: ID does not exist" Apr 22 18:01:18.722527 ip-10-0-135-36 kubenswrapper[2572]: I0422 
18:01:18.722460 2572 scope.go:117] "RemoveContainer" containerID="52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa" Apr 22 18:01:18.722794 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:18.722764 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa\": container with ID starting with 52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa not found: ID does not exist" containerID="52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa" Apr 22 18:01:18.722863 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.722800 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa"} err="failed to get container status \"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa\": rpc error: code = NotFound desc = could not find container \"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa\": container with ID starting with 52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa not found: ID does not exist" Apr 22 18:01:18.722863 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.722816 2572 scope.go:117] "RemoveContainer" containerID="dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" Apr 22 18:01:18.723088 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:01:18.723067 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52\": container with ID starting with dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52 not found: ID does not exist" containerID="dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" Apr 22 18:01:18.723207 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723092 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52"} err="failed to get container status \"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52\": rpc error: code = NotFound desc = could not find container \"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52\": container with ID starting with dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52 not found: ID does not exist" Apr 22 18:01:18.723207 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723105 2572 scope.go:117] "RemoveContainer" containerID="50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86" Apr 22 18:01:18.723375 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723346 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86"} err="failed to get container status \"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86\": rpc error: code = NotFound desc = could not find container \"50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86\": container with ID starting with 50d38536db14b820dfb7d5dda44888b3995af829d7938ad216e8c78d2eccbd86 not found: ID does not exist" Apr 22 18:01:18.723447 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723377 2572 scope.go:117] "RemoveContainer" containerID="52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa" Apr 22 18:01:18.723676 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723657 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa"} err="failed to get container status \"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa\": rpc error: code = NotFound desc = could not find container 
\"52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa\": container with ID starting with 52320edf7479f61faef436accc4bc3921bb4da0ea96d356aaaa87342ff22ffaa not found: ID does not exist" Apr 22 18:01:18.723676 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723676 2572 scope.go:117] "RemoveContainer" containerID="dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52" Apr 22 18:01:18.723975 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.723944 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52"} err="failed to get container status \"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52\": rpc error: code = NotFound desc = could not find container \"dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52\": container with ID starting with dc559ac89ef34ac475972e412c0988b5c4ef5ecc76f098c86169f2da9d3bdc52 not found: ID does not exist" Apr 22 18:01:18.727720 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727673 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.727720 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727720 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.727885 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727735 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhh4h\" (UniqueName: \"kubernetes.io/projected/bd998102-5c6d-4cad-9e90-90fe37d10a40-kube-api-access-nhh4h\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.727885 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727749 2572 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd998102-5c6d-4cad-9e90-90fe37d10a40-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.727885 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727766 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.727885 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.727783 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd998102-5c6d-4cad-9e90-90fe37d10a40-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:01:18.917106 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.917068 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"] Apr 22 18:01:18.921047 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:18.921016 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-55688d4795-7rb78"] Apr 22 18:01:19.111052 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:19.111006 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" path="/var/lib/kubelet/pods/bd998102-5c6d-4cad-9e90-90fe37d10a40/volumes" Apr 22 18:01:19.111766 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:19.111745 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" path="/var/lib/kubelet/pods/cbc546ea-5b75-484f-9eb1-8fa7ede15350/volumes" Apr 22 18:01:26.142532 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:26.142478 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:26.153832 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:26.153798 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:28.515432 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:28.515381 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:01:36.142808 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:36.142749 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:36.154637 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:36.154596 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:38.514882 ip-10-0-135-36 
kubenswrapper[2572]: I0422 18:01:38.514832 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:01:46.142933 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:46.142872 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:46.154379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:46.154342 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:48.515429 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:48.515373 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:01:56.142398 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:56.142344 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:01:56.154376 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:56.154338 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:01:58.514361 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:01:58.514315 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:02:06.142769 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:06.142712 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:06.153822 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:06.153779 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:08.514633 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:08.514589 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" 
podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:02:16.142510 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:16.142462 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:16.153754 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:16.153722 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:18.515182 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:18.515139 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:02:26.142811 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:26.142754 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:26.154325 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:26.154289 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:28.514633 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:28.514578 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:02:36.142687 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:36.142639 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:36.153808 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:36.153769 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:38.514373 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:38.514332 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 
18:02:46.142232 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:46.142115 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:46.154298 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:46.154260 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:48.515075 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:48.515036 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:02:56.142289 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:56.142238 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:02:56.153820 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:56.153787 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:02:58.514438 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:02:58.514389 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:06.142481 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:06.142433 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:03:06.153745 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:06.153710 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:03:08.515128 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:08.515081 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:16.142346 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:16.142292 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" 
podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:03:16.153915 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:16.153883 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:03:18.514392 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:18.514349 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:26.142592 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:26.142544 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:03:26.154101 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:26.154061 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:03:28.514434 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:28.514393 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:36.142351 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:36.142312 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8001/health\": dial tcp 10.134.0.57:8001: connect: connection refused" Apr 22 18:03:36.154045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:36.154010 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.58:8000/health\": dial tcp 10.134.0.58:8000: connect: connection refused" Apr 22 18:03:38.514724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:38.514671 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:46.151907 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:46.151875 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" Apr 22 18:03:46.167032 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:46.166998 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" Apr 22 18:03:46.173240 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:46.173211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" Apr 22 18:03:46.176879 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:46.176845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" Apr 22 18:03:48.514909 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:48.514871 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.59:8000/health\": dial tcp 10.134.0.59:8000: connect: connection refused" Apr 22 18:03:58.530963 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:58.530928 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:03:58.546344 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:58.546310 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:03:59.820347 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:59.820312 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"] Apr 22 18:03:59.821153 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:59.820995 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" 
podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main" containerID="cri-o://c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" gracePeriod=30 Apr 22 18:03:59.828568 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:59.828541 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"] Apr 22 18:03:59.828827 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:03:59.828798 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main" containerID="cri-o://fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda" gracePeriod=30 Apr 22 18:04:07.891301 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891266 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"] Apr 22 18:04:07.891877 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891855 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891881 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891905 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="storage-initializer" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891914 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="storage-initializer" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891924 2572 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="storage-initializer" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891932 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="storage-initializer" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891956 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891965 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891979 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.891986 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892000 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="storage-initializer" Apr 22 18:04:07.892010 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892008 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="storage-initializer" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892017 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 
18:04:07.892024 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892123 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc546ea-5b75-484f-9eb1-8fa7ede15350" containerName="main" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892142 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="llm-d-routing-sidecar" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892153 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca9f4883-47e8-4b60-8cb9-cfa478a35411" containerName="main" Apr 22 18:04:07.892569 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.892162 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd998102-5c6d-4cad-9e90-90fe37d10a40" containerName="main" Apr 22 18:04:07.895593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.895573 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.897956 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.897929 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-6z6rg\"" Apr 22 18:04:07.898269 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.898249 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 18:04:07.909300 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.909277 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"] Apr 22 18:04:07.916418 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.916387 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"] Apr 22 18:04:07.920022 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.919991 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:07.929247 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.929223 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"] Apr 22 18:04:07.931063 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.931180 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.931180 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr98j\" (UniqueName: \"kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.931180 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.931376 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:07.931376 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:07.931248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032373 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home\") pod 
\"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr98j\" (UniqueName: \"kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvrv\" (UniqueName: \"kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: 
\"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.032918 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.032900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.033173 ip-10-0-135-36 
kubenswrapper[2572]: I0422 18:04:08.032993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.034802 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.034781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.035418 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.035401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.041254 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.041231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr98j\" (UniqueName: \"kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j\") pod \"custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.134226 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvrv\" (UniqueName: \"kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134600 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134756 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134837 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.134884 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.134828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.136524 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.136502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.136729 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.136712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.142613 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.142557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xvrv\" (UniqueName: \"kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv\") pod \"custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.208338 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.208301 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:08.233250 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.233219 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:08.355757 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.355731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"] Apr 22 18:04:08.359363 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:04:08.359331 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263dcd47_767e_4041_899e_f38319e4e1cc.slice/crio-6d31277d4a6348afaa05896ce872e9867200cd6461652a67ae563ad308d27bfb WatchSource:0}: Error finding container 6d31277d4a6348afaa05896ce872e9867200cd6461652a67ae563ad308d27bfb: Status 404 returned error can't find the container with id 6d31277d4a6348afaa05896ce872e9867200cd6461652a67ae563ad308d27bfb Apr 22 18:04:08.380593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:08.380569 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"] Apr 22 18:04:08.382000 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:04:08.381978 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7239fd65_c36c_4ce0_94fd_c820738b9e0a.slice/crio-e36c7a488e949467be8a6c7038e9220e2ce9e878634ca9601cb6cbdc6fccbc70 WatchSource:0}: Error finding container e36c7a488e949467be8a6c7038e9220e2ce9e878634ca9601cb6cbdc6fccbc70: Status 404 returned error can't find the container with id e36c7a488e949467be8a6c7038e9220e2ce9e878634ca9601cb6cbdc6fccbc70 Apr 22 18:04:09.330264 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.330230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" 
event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerStarted","Data":"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"} Apr 22 18:04:09.330264 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.330271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerStarted","Data":"e36c7a488e949467be8a6c7038e9220e2ce9e878634ca9601cb6cbdc6fccbc70"} Apr 22 18:04:09.331487 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.331463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerStarted","Data":"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"} Apr 22 18:04:09.331561 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.331494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerStarted","Data":"6d31277d4a6348afaa05896ce872e9867200cd6461652a67ae563ad308d27bfb"} Apr 22 18:04:09.331619 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.331601 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:09.947936 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.947896 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"] Apr 22 18:04:09.948321 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:09.948290 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" 
podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main" containerID="cri-o://827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e" gracePeriod=30 Apr 22 18:04:10.347828 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:10.347725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerStarted","Data":"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"} Apr 22 18:04:13.367784 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:13.367748 2572 generic.go:358] "Generic (PLEG): container finished" podID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerID="53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2" exitCode=0 Apr 22 18:04:13.368196 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:13.367817 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerDied","Data":"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"} Apr 22 18:04:14.373674 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:14.373641 2572 generic.go:358] "Generic (PLEG): container finished" podID="263dcd47-767e-4041-899e-f38319e4e1cc" containerID="46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2" exitCode=0 Apr 22 18:04:14.374155 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:14.373732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerDied","Data":"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"} Apr 22 18:04:14.375622 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:14.375600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerStarted","Data":"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"} Apr 22 18:04:14.414895 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:14.414848 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podStartSLOduration=7.414830775 podStartE2EDuration="7.414830775s" podCreationTimestamp="2026-04-22 18:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:04:14.413606141 +0000 UTC m=+1787.892862833" watchObservedRunningTime="2026-04-22 18:04:14.414830775 +0000 UTC m=+1787.894087456" Apr 22 18:04:15.381891 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:15.381848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerStarted","Data":"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"} Apr 22 18:04:15.404831 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:15.404764 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podStartSLOduration=8.404743219 podStartE2EDuration="8.404743219s" podCreationTimestamp="2026-04-22 18:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:04:15.402994297 +0000 UTC m=+1788.882250976" watchObservedRunningTime="2026-04-22 18:04:15.404743219 +0000 UTC m=+1788.883999883" Apr 22 18:04:18.208943 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.208900 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:18.208943 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.208944 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:18.210413 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.210375 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:04:18.227642 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.227619 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:04:18.233633 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.233604 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:18.233633 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.233635 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:04:18.235399 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.235375 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:04:18.510747 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.510647 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:04:18.544450 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.543152 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:04:18.544450 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.543883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.546479 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.546451 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 18:04:18.546824 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.546801 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-t5nh2\"" Apr 22 18:04:18.644929 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.644893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.645109 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.644938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52tkz\" (UniqueName: \"kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.645109 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.644973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.645109 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.645043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.645109 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.645104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.645278 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.645131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 
18:04:18.746302 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746476 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746476 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746476 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746683 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52tkz\" (UniqueName: 
\"kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746776 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746776 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.746875 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.746797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.747110 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.747074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" 
(UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.748599 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.748572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.749215 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.749193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.754991 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.754968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52tkz\" (UniqueName: \"kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:18.859081 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:18.858991 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:19.008148 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:19.008118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:04:19.403853 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:19.403797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerStarted","Data":"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4"} Apr 22 18:04:19.404343 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:19.403861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerStarted","Data":"5ce0848979fe3ed97e06dfe670ef66e74bf73e9400585f0f01061e616237e46c"} Apr 22 18:04:24.429624 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:24.429588 2572 generic.go:358] "Generic (PLEG): container finished" podID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerID="8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4" exitCode=0 Apr 22 18:04:24.430258 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:24.429666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerDied","Data":"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4"} Apr 22 18:04:25.436459 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:25.436421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerStarted","Data":"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24"} Apr 22 18:04:25.459576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:25.459512 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.459491479 podStartE2EDuration="7.459491479s" podCreationTimestamp="2026-04-22 18:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:04:25.45555799 +0000 UTC m=+1798.934814684" watchObservedRunningTime="2026-04-22 18:04:25.459491479 +0000 UTC m=+1798.938748163" Apr 22 18:04:27.204470 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:27.204428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 18:04:27.204993 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:27.204436 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 18:04:28.209822 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:28.209772 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:04:28.234111 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:28.234066 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:04:28.859784 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:28.859740 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:28.861190 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:28.861159 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:04:29.821427 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:29.821369 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="llm-d-routing-sidecar" containerID="cri-o://17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" gracePeriod=2 Apr 22 18:04:30.303952 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.303925 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" Apr 22 18:04:30.308772 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.308730 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l_1103e4e1-442f-45fe-beb9-0727f4365396/main/0.log" Apr 22 18:04:30.309539 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.309516 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" Apr 22 18:04:30.462824 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.462790 2572 generic.go:358] "Generic (PLEG): container finished" podID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerID="fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda" exitCode=137 Apr 22 18:04:30.463017 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.462887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerDied","Data":"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda"} Apr 22 18:04:30.463017 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.462932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" event={"ID":"89a3c174-2c25-4a96-b6b3-d8060413a9f3","Type":"ContainerDied","Data":"865bcfd4aeab5df9e954099232319db01a39a061f2ae00179234927fa473e72c"} Apr 22 18:04:30.463650 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.463627 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf" Apr 22 18:04:30.464724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.464553 2572 scope.go:117] "RemoveContainer" containerID="fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda" Apr 22 18:04:30.465751 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.465730 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l_1103e4e1-442f-45fe-beb9-0727f4365396/main/0.log" Apr 22 18:04:30.467371 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467341 2572 generic.go:358] "Generic (PLEG): container finished" podID="1103e4e1-442f-45fe-beb9-0727f4365396" containerID="c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" exitCode=137 Apr 22 18:04:30.467467 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467381 2572 generic.go:358] "Generic (PLEG): container finished" podID="1103e4e1-442f-45fe-beb9-0727f4365396" containerID="17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" exitCode=0 Apr 22 18:04:30.467571 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467551 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerDied","Data":"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832"} Apr 22 18:04:30.467627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerDied","Data":"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946"} Apr 22 18:04:30.467627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467622 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" event={"ID":"1103e4e1-442f-45fe-beb9-0727f4365396","Type":"ContainerDied","Data":"b14b7c0ed64ef30e0f9a93f9d24df469beb11933cd970a17e6f5cbfe2dbc5ff8"} Apr 22 18:04:30.467957 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.467934 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l" Apr 22 18:04:30.470225 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470198 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.470322 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470379 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470365 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470433 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470417 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfkdz\" (UniqueName: \"kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: 
\"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470483 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470463 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470536 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470509 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.470588 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470546 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.470588 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470598 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.470732 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470628 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.470797 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470747 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470850 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470822 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache\") pod \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\" (UID: \"89a3c174-2c25-4a96-b6b3-d8060413a9f3\") " Apr 22 18:04:30.470930 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.470914 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49tl2\" (UniqueName: \"kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2\") pod \"1103e4e1-442f-45fe-beb9-0727f4365396\" (UID: \"1103e4e1-442f-45fe-beb9-0727f4365396\") " Apr 22 18:04:30.472949 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.472913 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache" (OuterVolumeSpecName: "model-cache") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.475507 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.475297 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2" (OuterVolumeSpecName: "kube-api-access-49tl2") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "kube-api-access-49tl2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:30.475507 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.475338 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home" (OuterVolumeSpecName: "home") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.475976 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.475710 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache" (OuterVolumeSpecName: "model-cache") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.476771 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.476744 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:30.481885 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.480034 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home" (OuterVolumeSpecName: "home") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.484498 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.483712 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm" (OuterVolumeSpecName: "dshm") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.484498 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.483901 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:30.484670 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.484589 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz" (OuterVolumeSpecName: "kube-api-access-qfkdz") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "kube-api-access-qfkdz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:30.490057 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.490029 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm" (OuterVolumeSpecName: "dshm") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.503443 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.503422 2572 scope.go:117] "RemoveContainer" containerID="8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b" Apr 22 18:04:30.508388 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.508361 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89a3c174-2c25-4a96-b6b3-d8060413a9f3" (UID: "89a3c174-2c25-4a96-b6b3-d8060413a9f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.513005 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.512973 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1103e4e1-442f-45fe-beb9-0727f4365396" (UID: "1103e4e1-442f-45fe-beb9-0727f4365396"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:30.545169 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.545124 2572 scope.go:117] "RemoveContainer" containerID="fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda" Apr 22 18:04:30.545584 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:30.545555 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda\": container with ID starting with fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda not found: ID does not exist" containerID="fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda" Apr 22 18:04:30.545724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.545598 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda"} err="failed to get container status \"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda\": rpc error: code = NotFound desc = could not find container \"fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda\": container with ID starting with fb02236cb6a77ca5c2efb47f4b9ab977c768e1e33b7786cb19c05ac723b46bda not found: ID does not exist" Apr 22 18:04:30.545724 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.545625 2572 scope.go:117] "RemoveContainer" containerID="8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b" Apr 22 18:04:30.546263 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:30.546237 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b\": container with ID starting with 8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b not found: ID does not exist" 
containerID="8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b" Apr 22 18:04:30.546381 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.546265 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b"} err="failed to get container status \"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b\": rpc error: code = NotFound desc = could not find container \"8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b\": container with ID starting with 8cf5f052987d51aa0197ad6d0ece8896100375838cd191f94d7b25fac7bd9e9b not found: ID does not exist" Apr 22 18:04:30.546381 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.546282 2572 scope.go:117] "RemoveContainer" containerID="c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" Apr 22 18:04:30.568923 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.568895 2572 scope.go:117] "RemoveContainer" containerID="2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357" Apr 22 18:04:30.572912 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572889 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572918 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49tl2\" (UniqueName: \"kubernetes.io/projected/1103e4e1-442f-45fe-beb9-0727f4365396-kube-api-access-49tl2\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572933 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1103e4e1-442f-45fe-beb9-0727f4365396-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" 
DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572946 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572955 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572963 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfkdz\" (UniqueName: \"kubernetes.io/projected/89a3c174-2c25-4a96-b6b3-d8060413a9f3-kube-api-access-qfkdz\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572973 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a3c174-2c25-4a96-b6b3-d8060413a9f3-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.572987 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.573000 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.573015 2572 reconciler_common.go:299] "Volume 
detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.573026 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1103e4e1-442f-45fe-beb9-0727f4365396-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.573045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.573047 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a3c174-2c25-4a96-b6b3-d8060413a9f3-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:30.607053 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.607026 2572 scope.go:117] "RemoveContainer" containerID="17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.618786 2572 scope.go:117] "RemoveContainer" containerID="c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:30.619140 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832\": container with ID starting with c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832 not found: ID does not exist" containerID="c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.619177 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832"} err="failed to get container status 
\"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832\": rpc error: code = NotFound desc = could not find container \"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832\": container with ID starting with c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832 not found: ID does not exist" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.619203 2572 scope.go:117] "RemoveContainer" containerID="2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:30.619461 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357\": container with ID starting with 2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357 not found: ID does not exist" containerID="2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.619493 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357"} err="failed to get container status \"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357\": rpc error: code = NotFound desc = could not find container \"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357\": container with ID starting with 2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357 not found: ID does not exist" Apr 22 18:04:30.619656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.619517 2572 scope.go:117] "RemoveContainer" containerID="17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" Apr 22 18:04:30.620139 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:30.619981 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946\": container with ID starting with 17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946 not found: ID does not exist" containerID="17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" Apr 22 18:04:30.620139 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620012 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946"} err="failed to get container status \"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946\": rpc error: code = NotFound desc = could not find container \"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946\": container with ID starting with 17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946 not found: ID does not exist" Apr 22 18:04:30.620139 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620033 2572 scope.go:117] "RemoveContainer" containerID="c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832" Apr 22 18:04:30.620656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620627 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832"} err="failed to get container status \"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832\": rpc error: code = NotFound desc = could not find container \"c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832\": container with ID starting with c3c14de04f2aa2f82e5409555aea8f50061b15235b8b81b2407570b58c5f3832 not found: ID does not exist" Apr 22 18:04:30.620656 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620656 2572 scope.go:117] "RemoveContainer" containerID="2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357" Apr 22 18:04:30.620971 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620949 
2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357"} err="failed to get container status \"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357\": rpc error: code = NotFound desc = could not find container \"2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357\": container with ID starting with 2ee350636e3de5b775ae3a42444556c46149c6ec64110081039eb35885eff357 not found: ID does not exist" Apr 22 18:04:30.621047 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.620973 2572 scope.go:117] "RemoveContainer" containerID="17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946" Apr 22 18:04:30.621268 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.621241 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946"} err="failed to get container status \"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946\": rpc error: code = NotFound desc = could not find container \"17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946\": container with ID starting with 17e7d0d2609487b2a4f6b7af87d80fe6edce4fcd199083754a7ad62e226c1946 not found: ID does not exist" Apr 22 18:04:30.792324 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.792291 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"] Apr 22 18:04:30.798316 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.798278 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-85qjwpf"] Apr 22 18:04:30.811410 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.811384 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"] Apr 22 18:04:30.818313 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:30.818287 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-8cd4989d6-4m44l"] Apr 22 18:04:31.099868 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:31.099783 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" path="/var/lib/kubelet/pods/1103e4e1-442f-45fe-beb9-0727f4365396/volumes" Apr 22 18:04:31.100545 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:31.100522 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" path="/var/lib/kubelet/pods/89a3c174-2c25-4a96-b6b3-d8060413a9f3/volumes" Apr 22 18:04:38.209594 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:38.209538 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:04:38.233835 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:38.233792 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:04:38.859675 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:38.859634 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:04:40.274995 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.274966 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z_7383fe24-e41e-4638-9411-2d715139b18b/main/0.log" Apr 22 18:04:40.275439 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.275418 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:04:40.367211 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367165 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7b47\" (UniqueName: \"kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367288 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367372 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367400 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367649 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367451 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache\") pod \"7383fe24-e41e-4638-9411-2d715139b18b\" (UID: \"7383fe24-e41e-4638-9411-2d715139b18b\") " Apr 22 18:04:40.367824 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.367788 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache" (OuterVolumeSpecName: "model-cache") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:40.368444 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.368355 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home" (OuterVolumeSpecName: "home") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:40.368875 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.368855 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.368993 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.368981 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.369836 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.369802 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:40.369953 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.369880 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47" (OuterVolumeSpecName: "kube-api-access-t7b47") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "kube-api-access-t7b47". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:40.370261 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.370234 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm" (OuterVolumeSpecName: "dshm") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:40.377407 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.377370 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7383fe24-e41e-4638-9411-2d715139b18b" (UID: "7383fe24-e41e-4638-9411-2d715139b18b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:40.470329 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.470294 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7b47\" (UniqueName: \"kubernetes.io/projected/7383fe24-e41e-4638-9411-2d715139b18b-kube-api-access-t7b47\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.470329 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.470323 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7383fe24-e41e-4638-9411-2d715139b18b-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.470329 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.470336 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.470329 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.470344 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7383fe24-e41e-4638-9411-2d715139b18b-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:04:40.518863 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.518764 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z_7383fe24-e41e-4638-9411-2d715139b18b/main/0.log" Apr 22 18:04:40.519132 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.519102 2572 generic.go:358] "Generic (PLEG): container finished" podID="7383fe24-e41e-4638-9411-2d715139b18b" containerID="827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e" exitCode=137 Apr 22 18:04:40.519276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.519199 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" Apr 22 18:04:40.519276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.519227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerDied","Data":"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e"} Apr 22 18:04:40.519276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.519268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z" event={"ID":"7383fe24-e41e-4638-9411-2d715139b18b","Type":"ContainerDied","Data":"40bbcc9a2cdde3a7361f6be9f2eecba7a301339ac49973101dd4f912c1811f3d"} Apr 22 18:04:40.519450 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.519284 2572 scope.go:117] "RemoveContainer" containerID="827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e" Apr 22 18:04:40.543867 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.543845 2572 scope.go:117] "RemoveContainer" containerID="53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992" Apr 22 18:04:40.550365 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.550337 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"] Apr 22 18:04:40.558474 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.558446 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6bdc8d656fb2n4z"] Apr 22 18:04:40.573410 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.573384 2572 scope.go:117] "RemoveContainer" containerID="827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e" Apr 22 18:04:40.573985 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:40.573946 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e\": container with ID starting with 827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e not found: ID does not exist" containerID="827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e" Apr 22 18:04:40.574099 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.573996 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e"} err="failed to get container status \"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e\": rpc error: code = NotFound desc = could not find container \"827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e\": container with ID starting with 827d5d751aa1275557e8b18efdc5dfc76315c8c7b67a019b2c732100680d899e not found: ID does not exist" Apr 22 18:04:40.574099 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.574027 2572 scope.go:117] "RemoveContainer" containerID="53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992" Apr 22 18:04:40.574322 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:04:40.574300 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992\": container with ID starting with 53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992 not found: ID does not exist" containerID="53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992" Apr 22 18:04:40.574388 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:40.574333 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992"} err="failed to get container status \"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992\": rpc error: code = NotFound desc = could not find container \"53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992\": container with ID starting with 53aafabda902d0c95bd683848a7be2c2ba316d1dd9120b6d5c880a11d724d992 not found: ID does not exist" Apr 22 18:04:41.099868 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:41.099833 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7383fe24-e41e-4638-9411-2d715139b18b" path="/var/lib/kubelet/pods/7383fe24-e41e-4638-9411-2d715139b18b/volumes" Apr 22 18:04:48.209119 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:48.209072 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:04:48.234672 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:48.234626 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 
18:04:48.860095 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:48.860051 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:04:48.860505 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:48.860471 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:04:58.209817 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:58.209757 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:04:58.234345 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:58.234289 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:04:58.859542 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:04:58.859497 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:05:08.209901 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:08.209835 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:05:08.234790 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:08.234717 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:08.860439 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:08.860394 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:05:18.209164 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:18.209109 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:05:18.234136 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:18.234099 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:18.859771 ip-10-0-135-36 kubenswrapper[2572]: 
I0422 18:05:18.859729 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:05:28.209609 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:28.209557 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:05:28.233987 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:28.233945 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:28.860161 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:28.860118 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:05:38.209102 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:38.209054 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 
18:05:38.234552 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:38.234513 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:38.860421 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:38.860379 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:05:48.209554 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:48.209437 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:05:48.234493 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:48.234450 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:48.860356 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:48.860307 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 
10.134.0.62:8000: connect: connection refused" Apr 22 18:05:58.209080 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:58.209024 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:05:58.233683 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:58.233646 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:05:58.859685 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:05:58.859640 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:08.208984 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:08.208926 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:08.233853 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:08.233797 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:08.860302 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:08.860257 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:18.209338 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:18.209283 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:18.234238 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:18.234202 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:18.860067 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:18.860027 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:28.209766 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:28.209720 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" 
podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:28.234617 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:28.234568 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:28.860096 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:28.860052 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:38.209366 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:38.209313 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:38.233641 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:38.233598 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:38.859439 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:38.859391 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:48.208959 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:48.208922 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:48.234589 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:48.234551 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:48.860380 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:48.860335 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:06:58.208946 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:58.208899 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 22 18:06:58.234180 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:58.234140 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 22 18:06:58.859476 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:06:58.859438 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:07:08.218865 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:08.218829 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:07:08.231785 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:08.231751 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:07:08.243960 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:08.243922 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:07:08.253774 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:08.253745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" Apr 22 18:07:08.859623 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:08.859582 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 22 18:07:18.869581 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:18.869554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:07:18.877752 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:18.877726 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:07:21.397239 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:21.397205 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"] Apr 22 18:07:21.398255 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:21.398202 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main" containerID="cri-o://5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36" gracePeriod=30 Apr 22 18:07:21.401497 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:21.401473 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"] Apr 22 18:07:21.401878 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:21.401827 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main" containerID="cri-o://27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba" gracePeriod=30 Apr 22 18:07:31.010310 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.010274 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:07:31.010715 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.010629 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main" containerID="cri-o://4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24" gracePeriod=30 Apr 22 18:07:31.888837 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.888815 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:07:31.998413 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998413 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998400 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998628 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998474 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52tkz\" (UniqueName: \"kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998628 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998498 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998628 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998526 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998628 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998560 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache\") pod \"05de3727-21e1-409f-bed9-d4b6f3f36806\" (UID: \"05de3727-21e1-409f-bed9-d4b6f3f36806\") " Apr 22 18:07:31.998922 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.998893 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache" (OuterVolumeSpecName: "model-cache") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:31.999057 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:31.999028 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home" (OuterVolumeSpecName: "home") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:32.000524 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.000492 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm" (OuterVolumeSpecName: "dshm") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:32.000738 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.000711 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:07:32.000831 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.000738 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz" (OuterVolumeSpecName: "kube-api-access-52tkz") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "kube-api-access-52tkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:07:32.057111 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.057058 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05de3727-21e1-409f-bed9-d4b6f3f36806" (UID: "05de3727-21e1-409f-bed9-d4b6f3f36806"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:32.099821 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099789 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52tkz\" (UniqueName: \"kubernetes.io/projected/05de3727-21e1-409f-bed9-d4b6f3f36806-kube-api-access-52tkz\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.099821 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099820 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05de3727-21e1-409f-bed9-d4b6f3f36806-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.099821 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099829 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.099821 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099838 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.100066 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099846 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.100066 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.099854 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05de3727-21e1-409f-bed9-d4b6f3f36806-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.232341 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.232309 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerID="4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24" exitCode=0 Apr 22 18:07:32.232518 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.232377 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:07:32.232518 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.232378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerDied","Data":"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24"} Apr 22 18:07:32.232518 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.232481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05de3727-21e1-409f-bed9-d4b6f3f36806","Type":"ContainerDied","Data":"5ce0848979fe3ed97e06dfe670ef66e74bf73e9400585f0f01061e616237e46c"} Apr 22 18:07:32.232518 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.232497 2572 scope.go:117] "RemoveContainer" containerID="4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24" Apr 22 18:07:32.254900 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.254824 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:07:32.255133 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.255113 2572 scope.go:117] "RemoveContainer" containerID="8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4" Apr 22 18:07:32.257353 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.257332 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:07:32.265478 
ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.265458 2572 scope.go:117] "RemoveContainer" containerID="4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24" Apr 22 18:07:32.265769 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:32.265747 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24\": container with ID starting with 4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24 not found: ID does not exist" containerID="4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24" Apr 22 18:07:32.265850 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.265777 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24"} err="failed to get container status \"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24\": rpc error: code = NotFound desc = could not find container \"4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24\": container with ID starting with 4d45686ed33489a78f2257ceb70bb8496cee56052f4fdef71e96c3cc27375f24 not found: ID does not exist" Apr 22 18:07:32.265850 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:32.265793 2572 scope.go:117] "RemoveContainer" containerID="8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4" Apr 22 18:07:32.266029 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:32.266009 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4\": container with ID starting with 8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4 not found: ID does not exist" containerID="8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4" Apr 22 18:07:32.266098 ip-10-0-135-36 
kubenswrapper[2572]: I0422 18:07:32.266037 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4"} err="failed to get container status \"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4\": rpc error: code = NotFound desc = could not find container \"8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4\": container with ID starting with 8dec719f86f2c777d27927cee8ad16eb289fcf4781072d09ec0b811a7ba267a4 not found: ID does not exist" Apr 22 18:07:33.099074 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:33.099042 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" path="/var/lib/kubelet/pods/05de3727-21e1-409f-bed9-d4b6f3f36806/volumes" Apr 22 18:07:51.398713 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.398628 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="llm-d-routing-sidecar" containerID="cri-o://6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7" gracePeriod=2 Apr 22 18:07:51.692455 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.692433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh_263dcd47-767e-4041-899e-f38319e4e1cc/main/0.log" Apr 22 18:07:51.693059 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.693042 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" Apr 22 18:07:51.695743 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.695725 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"
Apr 22 18:07:51.778044 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778013 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778220 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778058 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr98j\" (UniqueName: \"kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778220 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778074 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778220 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778220 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778134 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778220 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778180 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xvrv\" (UniqueName: \"kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778230 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778303 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs\") pod \"263dcd47-767e-4041-899e-f38319e4e1cc\" (UID: \"263dcd47-767e-4041-899e-f38319e4e1cc\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778354 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache" (OuterVolumeSpecName: "model-cache") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778378 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778358 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache" (OuterVolumeSpecName: "model-cache") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.778475 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778404 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm\") pod \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\" (UID: \"7239fd65-c36c-4ce0-94fd-c820738b9e0a\") "
Apr 22 18:07:51.778889 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778505 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home" (OuterVolumeSpecName: "home") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.778889 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778823 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.778889 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778843 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.778889 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.778858 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-model-cache\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.779380 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.779114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home" (OuterVolumeSpecName: "home") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.780937 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.780911 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j" (OuterVolumeSpecName: "kube-api-access-rr98j") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "kube-api-access-rr98j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:07:51.781043 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.780936 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm" (OuterVolumeSpecName: "dshm") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.781043 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.781013 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm" (OuterVolumeSpecName: "dshm") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.781329 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.781307 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:07:51.781405 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.781370 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv" (OuterVolumeSpecName: "kube-api-access-7xvrv") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "kube-api-access-7xvrv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:07:51.781623 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.781609 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:07:51.841190 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.841118 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7239fd65-c36c-4ce0-94fd-c820738b9e0a" (UID: "7239fd65-c36c-4ce0-94fd-c820738b9e0a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.841347 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.841202 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "263dcd47-767e-4041-899e-f38319e4e1cc" (UID: "263dcd47-767e-4041-899e-f38319e4e1cc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:51.879424 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879392 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xvrv\" (UniqueName: \"kubernetes.io/projected/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kube-api-access-7xvrv\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879424 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879423 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879434 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879445 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/263dcd47-767e-4041-899e-f38319e4e1cc-home\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879453 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/263dcd47-767e-4041-899e-f38319e4e1cc-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879462 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7239fd65-c36c-4ce0-94fd-c820738b9e0a-tls-certs\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879470 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-dshm\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879478 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7239fd65-c36c-4ce0-94fd-c820738b9e0a-kserve-provision-location\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:51.879593 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:51.879488 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rr98j\" (UniqueName: \"kubernetes.io/projected/263dcd47-767e-4041-899e-f38319e4e1cc-kube-api-access-rr98j\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\""
Apr 22 18:07:52.315340 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.315301 2572 generic.go:358] "Generic (PLEG): container finished" podID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerID="27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba" exitCode=137
Apr 22 18:07:52.315540 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.315379 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"
Apr 22 18:07:52.315540 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.315425 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerDied","Data":"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"}
Apr 22 18:07:52.315540 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.315463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5" event={"ID":"7239fd65-c36c-4ce0-94fd-c820738b9e0a","Type":"ContainerDied","Data":"e36c7a488e949467be8a6c7038e9220e2ce9e878634ca9601cb6cbdc6fccbc70"}
Apr 22 18:07:52.315540 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.315486 2572 scope.go:117] "RemoveContainer" containerID="27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"
Apr 22 18:07:52.316757 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.316741 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh_263dcd47-767e-4041-899e-f38319e4e1cc/main/0.log"
Apr 22 18:07:52.317353 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317333 2572 generic.go:358] "Generic (PLEG): container finished" podID="263dcd47-767e-4041-899e-f38319e4e1cc" containerID="5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36" exitCode=137
Apr 22 18:07:52.317353 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317353 2572 generic.go:358] "Generic (PLEG): container finished" podID="263dcd47-767e-4041-899e-f38319e4e1cc" containerID="6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7" exitCode=0
Apr 22 18:07:52.317512 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317410 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"
Apr 22 18:07:52.317512 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerDied","Data":"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"}
Apr 22 18:07:52.317512 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerDied","Data":"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"}
Apr 22 18:07:52.317512 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.317482 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh" event={"ID":"263dcd47-767e-4041-899e-f38319e4e1cc","Type":"ContainerDied","Data":"6d31277d4a6348afaa05896ce872e9867200cd6461652a67ae563ad308d27bfb"}
Apr 22 18:07:52.340804 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.340772 2572 scope.go:117] "RemoveContainer" containerID="53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"
Apr 22 18:07:52.343026 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.343005 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"]
Apr 22 18:07:52.346938 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.346920 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-f9d6cf688-jqwp5"]
Apr 22 18:07:52.352034 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.352015 2572 scope.go:117] "RemoveContainer" containerID="27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"
Apr 22 18:07:52.352337 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:52.352314 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba\": container with ID starting with 27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba not found: ID does not exist" containerID="27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"
Apr 22 18:07:52.352419 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.352344 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba"} err="failed to get container status \"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba\": rpc error: code = NotFound desc = could not find container \"27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba\": container with ID starting with 27f7d68c35134e8e80a129f7be7610a6798fbceeea8cc2958f1473a5e91953ba not found: ID does not exist"
Apr 22 18:07:52.352419 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.352363 2572 scope.go:117] "RemoveContainer" containerID="53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"
Apr 22 18:07:52.352624 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:52.352606 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2\": container with ID starting with 53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2 not found: ID does not exist" containerID="53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"
Apr 22 18:07:52.352675 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.352629 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2"} err="failed to get container status \"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2\": rpc error: code = NotFound desc = could not find container \"53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2\": container with ID starting with 53a30c66802d65632542989a387a804db9d63a1e16bc7b7779cb856a479a8fb2 not found: ID does not exist"
Apr 22 18:07:52.352675 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.352649 2572 scope.go:117] "RemoveContainer" containerID="5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"
Apr 22 18:07:52.358227 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.358205 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"]
Apr 22 18:07:52.363073 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.363055 2572 scope.go:117] "RemoveContainer" containerID="46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"
Apr 22 18:07:52.364279 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.364258 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bbf8b68c5-s4rwh"]
Apr 22 18:07:52.372913 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.372898 2572 scope.go:117] "RemoveContainer" containerID="6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"
Apr 22 18:07:52.381065 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381036 2572 scope.go:117] "RemoveContainer" containerID="5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"
Apr 22 18:07:52.381294 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:52.381276 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36\": container with ID starting with 5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36 not found: ID does not exist" containerID="5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"
Apr 22 18:07:52.381360 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381301 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"} err="failed to get container status \"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36\": rpc error: code = NotFound desc = could not find container \"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36\": container with ID starting with 5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36 not found: ID does not exist"
Apr 22 18:07:52.381360 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381326 2572 scope.go:117] "RemoveContainer" containerID="46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"
Apr 22 18:07:52.381565 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:52.381546 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2\": container with ID starting with 46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2 not found: ID does not exist" containerID="46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"
Apr 22 18:07:52.381631 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381573 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"} err="failed to get container status \"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2\": rpc error: code = NotFound desc = could not find container \"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2\": container with ID starting with 46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2 not found: ID does not exist"
Apr 22 18:07:52.381631 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381595 2572 scope.go:117] "RemoveContainer" containerID="6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"
Apr 22 18:07:52.381847 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:07:52.381830 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7\": container with ID starting with 6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7 not found: ID does not exist" containerID="6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"
Apr 22 18:07:52.381916 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381854 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"} err="failed to get container status \"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7\": rpc error: code = NotFound desc = could not find container \"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7\": container with ID starting with 6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7 not found: ID does not exist"
Apr 22 18:07:52.381916 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.381872 2572 scope.go:117] "RemoveContainer" containerID="5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"
Apr 22 18:07:52.382077 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.382055 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36"} err="failed to get container status \"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36\": rpc error: code = NotFound desc = could not find container \"5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36\": container with ID starting with 5cfd1c4c335767bc9b402e1a85ea24c0d15ce50fc181f248ced13652cc08ee36 not found: ID does not exist"
Apr 22 18:07:52.382121 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.382078 2572 scope.go:117] "RemoveContainer" containerID="46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"
Apr 22 18:07:52.382294 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.382271 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2"} err="failed to get container status \"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2\": rpc error: code = NotFound desc = could not find container \"46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2\": container with ID starting with 46c22bfdfa5eaf80e709700c3140f7e1b4584e80cee2a8e71f53622e6acd2dd2 not found: ID does not exist"
Apr 22 18:07:52.382294 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.382293 2572 scope.go:117] "RemoveContainer" containerID="6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"
Apr 22 18:07:52.382519 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:52.382500 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7"} err="failed to get container status \"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7\": rpc error: code = NotFound desc = could not find container \"6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7\": container with ID starting with 6909dec76e19385a82149a94ee584092507af61789b128efb69f9a91cbc4d1c7 not found: ID does not exist"
Apr 22 18:07:53.099229 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:53.099196 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" path="/var/lib/kubelet/pods/263dcd47-767e-4041-899e-f38319e4e1cc/volumes"
Apr 22 18:07:53.099661 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:07:53.099648 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" path="/var/lib/kubelet/pods/7239fd65-c36c-4ce0-94fd-c820738b9e0a/volumes"
Apr 22 18:08:04.215802 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.215767 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2bv9h/must-gather-4fsmt"]
Apr 22 18:08:04.216201 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216182 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main"
Apr 22 18:08:04.216201 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216194 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216209 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216218 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216228 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216234 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216242 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="storage-initializer"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216247 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="storage-initializer"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216252 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="storage-initializer"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216258 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="storage-initializer"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216264 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216268 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216275 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main"
Apr 22 18:08:04.216276 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216281 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216297 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216303 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216309 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216314 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216319 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216324 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216330 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216335 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216343 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216348 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216356 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216360 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216373 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216380 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="storage-initializer"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216436 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216450 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1103e4e1-442f-45fe-beb9-0727f4365396" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216460 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7383fe24-e41e-4638-9411-2d715139b18b" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216467 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="llm-d-routing-sidecar"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216474 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="89a3c174-2c25-4a96-b6b3-d8060413a9f3" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216481 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7239fd65-c36c-4ce0-94fd-c820738b9e0a" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216488 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="263dcd47-767e-4041-899e-f38319e4e1cc" containerName="main"
Apr 22 18:08:04.216636 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.216493 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="05de3727-21e1-409f-bed9-d4b6f3f36806" containerName="main"
Apr 22 18:08:04.218547 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.218532 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bv9h/must-gather-4fsmt"
Apr 22 18:08:04.222362 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.222344 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2bv9h\"/\"kube-root-ca.crt\""
Apr 22 18:08:04.223044 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.223030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2bv9h\"/\"default-dockercfg-h9hck\""
Apr 22 18:08:04.223098 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.223077 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2bv9h\"/\"openshift-service-ca.crt\""
Apr 22 18:08:04.236145 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.236113 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bv9h/must-gather-4fsmt"]
Apr 22 18:08:04.388348 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.388317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57b6v\" (UniqueName: \"kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v\") pod \"must-gather-4fsmt\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt"
Apr 22 18:08:04.388511 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.388376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output\") pod \"must-gather-4fsmt\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.489747 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.489625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57b6v\" (UniqueName: \"kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v\") pod \"must-gather-4fsmt\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.489747 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.489674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output\") pod \"must-gather-4fsmt\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.490008 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.489989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output\") pod \"must-gather-4fsmt\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.506662 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.506635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57b6v\" (UniqueName: \"kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v\") pod \"must-gather-4fsmt\" (UID: 
\"b7cd397d-66df-4554-97ab-826f049f3fcc\") " pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.527467 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.527439 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:04.652349 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.652306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bv9h/must-gather-4fsmt"] Apr 22 18:08:04.654623 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:08:04.654598 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7cd397d_66df_4554_97ab_826f049f3fcc.slice/crio-aa43709e8f83f3ad1d4082a1d1b96f4e6dfe29efc23c5f3be448c2a713a569d3 WatchSource:0}: Error finding container aa43709e8f83f3ad1d4082a1d1b96f4e6dfe29efc23c5f3be448c2a713a569d3: Status 404 returned error can't find the container with id aa43709e8f83f3ad1d4082a1d1b96f4e6dfe29efc23c5f3be448c2a713a569d3 Apr 22 18:08:04.656352 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:04.656334 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:08:05.371404 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:05.371366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" event={"ID":"b7cd397d-66df-4554-97ab-826f049f3fcc","Type":"ContainerStarted","Data":"aa43709e8f83f3ad1d4082a1d1b96f4e6dfe29efc23c5f3be448c2a713a569d3"} Apr 22 18:08:10.396126 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:10.396086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" event={"ID":"b7cd397d-66df-4554-97ab-826f049f3fcc","Type":"ContainerStarted","Data":"19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6"} Apr 22 18:08:10.396126 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:10.396131 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" event={"ID":"b7cd397d-66df-4554-97ab-826f049f3fcc","Type":"ContainerStarted","Data":"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb"} Apr 22 18:08:10.413727 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:10.413659 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" podStartSLOduration=1.5709312739999999 podStartE2EDuration="6.41364169s" podCreationTimestamp="2026-04-22 18:08:04 +0000 UTC" firstStartedPulling="2026-04-22 18:08:04.6564876 +0000 UTC m=+2018.135744259" lastFinishedPulling="2026-04-22 18:08:09.499198016 +0000 UTC m=+2022.978454675" observedRunningTime="2026-04-22 18:08:10.412482884 +0000 UTC m=+2023.891739565" watchObservedRunningTime="2026-04-22 18:08:10.41364169 +0000 UTC m=+2023.892898373" Apr 22 18:08:33.405897 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:33.405860 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-drzkg_08478191-02d1-4e5b-bb2b-84c1fa06481d/limitador/0.log" Apr 22 18:08:34.501683 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:34.501652 2572 generic.go:358] "Generic (PLEG): container finished" podID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerID="7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb" exitCode=0 Apr 22 18:08:34.502144 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:34.501734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" event={"ID":"b7cd397d-66df-4554-97ab-826f049f3fcc","Type":"ContainerDied","Data":"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb"} Apr 22 18:08:34.502144 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:34.502047 2572 scope.go:117] "RemoveContainer" containerID="7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb" Apr 22 18:08:35.047332 
ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.047298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2bv9h_must-gather-4fsmt_b7cd397d-66df-4554-97ab-826f049f3fcc/gather/0.log" Apr 22 18:08:35.661501 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.661466 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h8t2w/must-gather-xq6zx"] Apr 22 18:08:35.664672 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.664651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.666974 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.666951 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"openshift-service-ca.crt\"" Apr 22 18:08:35.667848 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.667828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h8t2w\"/\"default-dockercfg-r2vhp\"" Apr 22 18:08:35.667848 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.667840 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"kube-root-ca.crt\"" Apr 22 18:08:35.670894 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.670866 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/must-gather-xq6zx"] Apr 22 18:08:35.782625 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.782594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzx4\" (UniqueName: \"kubernetes.io/projected/1664a214-8701-43e7-948d-b5a20dec0d27-kube-api-access-xkzx4\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.782812 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.782659 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1664a214-8701-43e7-948d-b5a20dec0d27-must-gather-output\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.883363 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.883336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzx4\" (UniqueName: \"kubernetes.io/projected/1664a214-8701-43e7-948d-b5a20dec0d27-kube-api-access-xkzx4\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.883501 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.883377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1664a214-8701-43e7-948d-b5a20dec0d27-must-gather-output\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.883639 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.883625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1664a214-8701-43e7-948d-b5a20dec0d27-must-gather-output\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:35.891067 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.891052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzx4\" (UniqueName: \"kubernetes.io/projected/1664a214-8701-43e7-948d-b5a20dec0d27-kube-api-access-xkzx4\") pod \"must-gather-xq6zx\" (UID: \"1664a214-8701-43e7-948d-b5a20dec0d27\") " pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 
18:08:35.974345 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:35.974263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" Apr 22 18:08:36.093286 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:36.093260 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/must-gather-xq6zx"] Apr 22 18:08:36.094826 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:08:36.094800 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1664a214_8701_43e7_948d_b5a20dec0d27.slice/crio-a6e183271c414e39fb05a1f59030fe51879805963555894476a41141595713e7 WatchSource:0}: Error finding container a6e183271c414e39fb05a1f59030fe51879805963555894476a41141595713e7: Status 404 returned error can't find the container with id a6e183271c414e39fb05a1f59030fe51879805963555894476a41141595713e7 Apr 22 18:08:36.513065 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:36.513024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" event={"ID":"1664a214-8701-43e7-948d-b5a20dec0d27","Type":"ContainerStarted","Data":"a6e183271c414e39fb05a1f59030fe51879805963555894476a41141595713e7"} Apr 22 18:08:37.520248 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:37.520146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" event={"ID":"1664a214-8701-43e7-948d-b5a20dec0d27","Type":"ContainerStarted","Data":"9a31338a350e3b8a3c67502352e75f8971eb17b2bf2a383881ca004bb8f28728"} Apr 22 18:08:37.520248 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:37.520206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" event={"ID":"1664a214-8701-43e7-948d-b5a20dec0d27","Type":"ContainerStarted","Data":"c39ae9529d1e77e07b675b25bc2b75828d441025199f3f37f8ba4e29517fffa3"} Apr 22 18:08:37.540190 
ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:37.540112 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h8t2w/must-gather-xq6zx" podStartSLOduration=1.7165221339999999 podStartE2EDuration="2.540092803s" podCreationTimestamp="2026-04-22 18:08:35 +0000 UTC" firstStartedPulling="2026-04-22 18:08:36.096650315 +0000 UTC m=+2049.575906975" lastFinishedPulling="2026-04-22 18:08:36.92022098 +0000 UTC m=+2050.399477644" observedRunningTime="2026-04-22 18:08:37.536903733 +0000 UTC m=+2051.016160416" watchObservedRunningTime="2026-04-22 18:08:37.540092803 +0000 UTC m=+2051.019349485" Apr 22 18:08:38.649920 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:38.649886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tsg64_fb37d3e6-7be5-4eaa-8699-e8a8a641b235/global-pull-secret-syncer/0.log" Apr 22 18:08:38.741339 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:38.741306 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8n7x2_7c07b1eb-ef24-4289-8216-10e5782c6173/konnectivity-agent/0.log" Apr 22 18:08:38.820045 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:38.820013 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-36.ec2.internal_ec7cb24e845e46d881c2a9f07f361da0/haproxy/0.log" Apr 22 18:08:40.516787 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.516746 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2bv9h/must-gather-4fsmt"] Apr 22 18:08:40.517255 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.517043 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="copy" containerID="cri-o://19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6" gracePeriod=2 Apr 22 18:08:40.519111 ip-10-0-135-36 
kubenswrapper[2572]: I0422 18:08:40.519074 2572 status_manager.go:895] "Failed to get status for pod" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" err="pods \"must-gather-4fsmt\" is forbidden: User \"system:node:ip-10-0-135-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2bv9h\": no relationship found between node 'ip-10-0-135-36.ec2.internal' and this object" Apr 22 18:08:40.519692 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.519661 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2bv9h/must-gather-4fsmt"] Apr 22 18:08:40.923067 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.923038 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2bv9h_must-gather-4fsmt_b7cd397d-66df-4554-97ab-826f049f3fcc/copy/0.log" Apr 22 18:08:40.923580 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.923558 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:40.926685 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:40.926639 2572 status_manager.go:895] "Failed to get status for pod" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" err="pods \"must-gather-4fsmt\" is forbidden: User \"system:node:ip-10-0-135-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2bv9h\": no relationship found between node 'ip-10-0-135-36.ec2.internal' and this object" Apr 22 18:08:41.045774 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.045658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57b6v\" (UniqueName: \"kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v\") pod \"b7cd397d-66df-4554-97ab-826f049f3fcc\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " Apr 22 18:08:41.045774 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.045747 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output\") pod \"b7cd397d-66df-4554-97ab-826f049f3fcc\" (UID: \"b7cd397d-66df-4554-97ab-826f049f3fcc\") " Apr 22 18:08:41.054558 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.054515 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b7cd397d-66df-4554-97ab-826f049f3fcc" (UID: "b7cd397d-66df-4554-97ab-826f049f3fcc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:41.057180 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.057139 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v" (OuterVolumeSpecName: "kube-api-access-57b6v") pod "b7cd397d-66df-4554-97ab-826f049f3fcc" (UID: "b7cd397d-66df-4554-97ab-826f049f3fcc"). InnerVolumeSpecName "kube-api-access-57b6v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:41.101295 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.101255 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" path="/var/lib/kubelet/pods/b7cd397d-66df-4554-97ab-826f049f3fcc/volumes" Apr 22 18:08:41.147650 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.147604 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57b6v\" (UniqueName: \"kubernetes.io/projected/b7cd397d-66df-4554-97ab-826f049f3fcc-kube-api-access-57b6v\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:08:41.147650 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.147650 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b7cd397d-66df-4554-97ab-826f049f3fcc-must-gather-output\") on node \"ip-10-0-135-36.ec2.internal\" DevicePath \"\"" Apr 22 18:08:41.543961 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.543926 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2bv9h_must-gather-4fsmt_b7cd397d-66df-4554-97ab-826f049f3fcc/copy/0.log" Apr 22 18:08:41.544441 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.544324 2572 generic.go:358] "Generic (PLEG): container finished" podID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerID="19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6" exitCode=143 Apr 22 18:08:41.544510 
ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.544450 2572 scope.go:117] "RemoveContainer" containerID="19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6" Apr 22 18:08:41.544615 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.544597 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bv9h/must-gather-4fsmt" Apr 22 18:08:41.558555 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.558524 2572 scope.go:117] "RemoveContainer" containerID="7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb" Apr 22 18:08:41.589454 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.589332 2572 scope.go:117] "RemoveContainer" containerID="19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6" Apr 22 18:08:41.591729 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:08:41.589772 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6\": container with ID starting with 19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6 not found: ID does not exist" containerID="19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6" Apr 22 18:08:41.591729 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.589815 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6"} err="failed to get container status \"19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6\": rpc error: code = NotFound desc = could not find container \"19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6\": container with ID starting with 19fc10923aca451767112093732b464749f97dca3a5ed724de425f002fd77fe6 not found: ID does not exist" Apr 22 18:08:41.591729 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.589842 2572 scope.go:117] "RemoveContainer" 
containerID="7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb" Apr 22 18:08:41.591729 ip-10-0-135-36 kubenswrapper[2572]: E0422 18:08:41.590148 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb\": container with ID starting with 7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb not found: ID does not exist" containerID="7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb" Apr 22 18:08:41.591729 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:41.590180 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb"} err="failed to get container status \"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb\": rpc error: code = NotFound desc = could not find container \"7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb\": container with ID starting with 7289d67ad0317d0da1a4f7f06c62dd6b428d9165b8beb6ee1607bbfdd9582bfb not found: ID does not exist" Apr 22 18:08:43.398811 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:43.398780 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-drzkg_08478191-02d1-4e5b-bb2b-84c1fa06481d/limitador/0.log" Apr 22 18:08:44.513542 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.513508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/alertmanager/0.log" Apr 22 18:08:44.544587 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.544557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/config-reloader/0.log" Apr 22 18:08:44.575143 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.575114 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/kube-rbac-proxy-web/0.log" Apr 22 18:08:44.598499 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.598474 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/kube-rbac-proxy/0.log" Apr 22 18:08:44.617383 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.617345 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/kube-rbac-proxy-metric/0.log" Apr 22 18:08:44.636456 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.636427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/prom-label-proxy/0.log" Apr 22 18:08:44.659085 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.659042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_91a6df17-c871-4536-9033-6a7971a37d9f/init-config-reloader/0.log" Apr 22 18:08:44.705818 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:44.705728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-knbfq_7676326c-368f-406c-a545-55cebded1f2b/cluster-monitoring-operator/0.log" Apr 22 18:08:45.004746 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.004654 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r7j8x_b8d65af8-cf50-4937-a8f0-ee24c1820476/node-exporter/0.log" Apr 22 18:08:45.039820 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.039785 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r7j8x_b8d65af8-cf50-4937-a8f0-ee24c1820476/kube-rbac-proxy/0.log" Apr 22 18:08:45.081677 ip-10-0-135-36 kubenswrapper[2572]: I0422 
18:08:45.081645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r7j8x_b8d65af8-cf50-4937-a8f0-ee24c1820476/init-textfile/0.log" Apr 22 18:08:45.124753 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.124715 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgpzl_843754df-5fe6-43b3-9b50-0b7e518d1c36/kube-rbac-proxy-main/0.log" Apr 22 18:08:45.159771 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.159744 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgpzl_843754df-5fe6-43b3-9b50-0b7e518d1c36/kube-rbac-proxy-self/0.log" Apr 22 18:08:45.198436 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.198365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vgpzl_843754df-5fe6-43b3-9b50-0b7e518d1c36/openshift-state-metrics/0.log" Apr 22 18:08:45.553961 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.553914 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7d67bb5c78-h5pqt_79bfd8d1-2166-4874-9a74-b05e77923eae/telemeter-client/0.log" Apr 22 18:08:45.592484 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.592458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7d67bb5c78-h5pqt_79bfd8d1-2166-4874-9a74-b05e77923eae/reload/0.log" Apr 22 18:08:45.634713 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.634599 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7d67bb5c78-h5pqt_79bfd8d1-2166-4874-9a74-b05e77923eae/kube-rbac-proxy/0.log" Apr 22 18:08:45.682969 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.682932 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/thanos-query/0.log" Apr 22 18:08:45.721534 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.721496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/kube-rbac-proxy-web/0.log" Apr 22 18:08:45.760573 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.760542 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/kube-rbac-proxy/0.log" Apr 22 18:08:45.801555 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.801528 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/prom-label-proxy/0.log" Apr 22 18:08:45.826087 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.826011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/kube-rbac-proxy-rules/0.log" Apr 22 18:08:45.846439 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:45.846407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7969c5c58b-f8f2b_d32d0e48-df72-4243-81b9-ba8cf37c6bb6/kube-rbac-proxy-metrics/0.log" Apr 22 18:08:47.350843 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.350807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/2.log" Apr 22 18:08:47.359241 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.359213 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gpsv4_31b62473-ecf9-4e50-926e-50d8d8ca9231/console-operator/3.log" Apr 22 18:08:47.405887 
ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.405852 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"]
Apr 22 18:08:47.406665 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406639 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="copy"
Apr 22 18:08:47.406814 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406681 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="copy"
Apr 22 18:08:47.406814 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406753 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="gather"
Apr 22 18:08:47.406814 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406763 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="gather"
Apr 22 18:08:47.407270 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406877 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="gather"
Apr 22 18:08:47.407270 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.406889 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7cd397d-66df-4554-97ab-826f049f3fcc" containerName="copy"
Apr 22 18:08:47.413710 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.413666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.419854 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.419829 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"]
Apr 22 18:08:47.511981 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.511944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ggg\" (UniqueName: \"kubernetes.io/projected/8056141d-0e72-4e4d-821d-3207f67409e9-kube-api-access-m2ggg\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.512153 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.511998 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-lib-modules\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.512153 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.512045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-proc\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.512153 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.512098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-podres\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.512153 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.512124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-sys\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613396 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-podres\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613396 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-sys\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ggg\" (UniqueName: \"kubernetes.io/projected/8056141d-0e72-4e4d-821d-3207f67409e9-kube-api-access-m2ggg\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-podres\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-sys\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-lib-modules\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-proc\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613627 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-lib-modules\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.613975 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.613650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8056141d-0e72-4e4d-821d-3207f67409e9-proc\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.622914 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.621589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ggg\" (UniqueName: \"kubernetes.io/projected/8056141d-0e72-4e4d-821d-3207f67409e9-kube-api-access-m2ggg\") pod \"perf-node-gather-daemonset-mgdw7\" (UID: \"8056141d-0e72-4e4d-821d-3207f67409e9\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.730713 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.730657 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:47.840848 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.840805 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c5574b9d-hkr9q_5d4de06e-f975-47e8-9a8e-9cfa268d7968/console/0.log"
Apr 22 18:08:47.888408 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.888323 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-74lqm_df068068-e066-4a5c-86c3-8ad5f03f5f19/download-server/0.log"
Apr 22 18:08:47.913545 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:47.913513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"]
Apr 22 18:08:47.915398 ip-10-0-135-36 kubenswrapper[2572]: W0422 18:08:47.915368 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8056141d_0e72_4e4d_821d_3207f67409e9.slice/crio-4e6a12e3be3c5f8481688596fbb0c58b5c1ed27f32e44c93aa0bf5300bf44f33 WatchSource:0}: Error finding container 4e6a12e3be3c5f8481688596fbb0c58b5c1ed27f32e44c93aa0bf5300bf44f33: Status 404 returned error can't find the container with id 4e6a12e3be3c5f8481688596fbb0c58b5c1ed27f32e44c93aa0bf5300bf44f33
Apr 22 18:08:48.597046 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:48.597009 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7" event={"ID":"8056141d-0e72-4e4d-821d-3207f67409e9","Type":"ContainerStarted","Data":"6da633349fdfc80186f78198f59a50e54518cdddd73ffd98ab97da03b9cda3f5"}
Apr 22 18:08:48.597046 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:48.597048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7" event={"ID":"8056141d-0e72-4e4d-821d-3207f67409e9","Type":"ContainerStarted","Data":"4e6a12e3be3c5f8481688596fbb0c58b5c1ed27f32e44c93aa0bf5300bf44f33"}
Apr 22 18:08:48.597548 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:48.597076 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:48.614469 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:48.614420 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7" podStartSLOduration=1.614405565 podStartE2EDuration="1.614405565s" podCreationTimestamp="2026-04-22 18:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:48.610780153 +0000 UTC m=+2062.090036837" watchObservedRunningTime="2026-04-22 18:08:48.614405565 +0000 UTC m=+2062.093662657"
Apr 22 18:08:49.083729 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:49.083670 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m45f5_8da954ec-226e-4e6e-a43d-0ef4bc182e6c/dns/0.log"
Apr 22 18:08:49.101855 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:49.101830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m45f5_8da954ec-226e-4e6e-a43d-0ef4bc182e6c/kube-rbac-proxy/0.log"
Apr 22 18:08:49.122293 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:49.122263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4nmbk_0e0b96d6-6ddc-4b33-8123-2eec86f21a66/dns-node-resolver/0.log"
Apr 22 18:08:49.607446 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:49.607421 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7q8mm_2f76cec7-7e9b-4a76-a5c5-c12f9790bb38/node-ca/0.log"
Apr 22 18:08:50.914209 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:50.914182 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-66csc_7c3b2116-a369-4692-8afd-099a5a5a39cd/serve-healthcheck-canary/0.log"
Apr 22 18:08:51.384896 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:51.384800 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k6554_611abd7b-ca37-43e2-a480-f0fc4b1aa306/insights-operator/1.log"
Apr 22 18:08:51.439756 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:51.439726 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k6554_611abd7b-ca37-43e2-a480-f0fc4b1aa306/insights-operator/0.log"
Apr 22 18:08:51.588676 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:51.588646 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xq75n_5dab7bfc-8f04-4d6d-816b-38e69e040297/kube-rbac-proxy/0.log"
Apr 22 18:08:51.606487 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:51.606462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xq75n_5dab7bfc-8f04-4d6d-816b-38e69e040297/exporter/0.log"
Apr 22 18:08:51.626799 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:51.626767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xq75n_5dab7bfc-8f04-4d6d-816b-38e69e040297/extractor/0.log"
Apr 22 18:08:54.101640 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:54.101608 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-959c974c-qb4dq_59ab3137-77a7-4d89-9a7d-31b1045686f1/manager/0.log"
Apr 22 18:08:54.122983 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:54.122960 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-5dz78_64fc1c68-9769-4b9c-b1c9-fcd2f3fe8002/openshift-lws-operator/0.log"
Apr 22 18:08:54.612314 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:54.612288 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-mgdw7"
Apr 22 18:08:54.914994 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:54.914967 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-hltgs_ee49a464-d38e-4463-ba52-0ab26dafd3c7/s3-init/0.log"
Apr 22 18:08:59.476436 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:59.476345 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-r4g8x_306bad3b-30a1-40bf-9575-15a852186090/migrator/0.log"
Apr 22 18:08:59.496961 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:08:59.496930 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-r4g8x_306bad3b-30a1-40bf-9575-15a852186090/graceful-termination/0.log"
Apr 22 18:09:00.765392 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.765362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:09:00.786994 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.786970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/egress-router-binary-copy/0.log"
Apr 22 18:09:00.805216 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.805191 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/cni-plugins/0.log"
Apr 22 18:09:00.831041 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.831015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/bond-cni-plugin/0.log"
Apr 22 18:09:00.849889 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.849869 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/routeoverride-cni/0.log"
Apr 22 18:09:00.870349 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.870330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/whereabouts-cni-bincopy/0.log"
Apr 22 18:09:00.890881 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:00.890859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5drmj_aedd63e3-6ddf-4202-adc9-e73988dd4d87/whereabouts-cni/0.log"
Apr 22 18:09:01.302011 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:01.301985 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q2vht_de5c6256-7a62-4226-ba85-b1cfcfd4d404/kube-multus/0.log"
Apr 22 18:09:01.394758 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:01.394728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-djttm_34c7625b-b71f-4d8d-a883-c465098dbba7/network-metrics-daemon/0.log"
Apr 22 18:09:01.409041 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:01.408995 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-djttm_34c7625b-b71f-4d8d-a883-c465098dbba7/kube-rbac-proxy/0.log"
Apr 22 18:09:02.481802 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.481764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/ovn-controller/0.log"
Apr 22 18:09:02.515337 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.515300 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/ovn-acl-logging/0.log"
Apr 22 18:09:02.535119 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.535090 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/kube-rbac-proxy-node/0.log"
Apr 22 18:09:02.555015 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.554985 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:09:02.580323 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.580289 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/northd/0.log"
Apr 22 18:09:02.602165 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.602141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/nbdb/0.log"
Apr 22 18:09:02.622907 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.622878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/sbdb/0.log"
Apr 22 18:09:02.810687 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:02.810559 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hrsxq_a719b6ff-4e34-4393-bec2-9239979501ec/ovnkube-controller/0.log"
Apr 22 18:09:04.104576 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:04.104547 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-pwhxf_f3b152e4-0668-4dd6-9f03-c910a1ce0561/check-endpoints/0.log"
Apr 22 18:09:04.171490 ip-10-0-135-36 kubenswrapper[2572]: I0422 18:09:04.171460 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-g829p_27c6ba53-cee0-478e-afff-6fec5a07bc6f/network-check-target-container/0.log"